2026-03-08T22:37:38.817 INFO:root:teuthology version: 1.2.4.dev6+g1c580df7a
2026-03-08T22:37:38.820 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-08T22:37:38.840 INFO:teuthology.run:Config:
archive_path: /archive/kyr-2026-03-08_21:49:43-rados:standalone-squid-none-default-vps/278
branch: squid
description: rados:standalone/{supported-random-distro$/{centos_latest} workloads/misc}
email: null
first_in_suite: false
flavor: default
job_id: '278'
last_in_suite: false
machine_type: vps
name: kyr-2026-03-08_21:49:43-rados:standalone-squid-none-default-vps
no_nested_subset: false
openstack:
- volumes:
    count: 3
    size: 10
os_type: centos
os_version: 9.stream
overrides:
  admin_socket:
    branch: squid
  ansible.cephlab:
    branch: main
    skip_tags: nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
    vars:
      timezone: UTC
  ceph:
    conf:
      mgr:
        debug mgr: 20
        debug ms: 1
      mon:
        debug mon: 20
        debug ms: 1
        debug paxos: 20
      osd:
        debug ms: 1
        debug osd: 20
        osd mclock iops capacity threshold hdd: 49000
    flavor: default
    log-ignorelist:
    - \(MDS_ALL_DOWN\)
    - \(MDS_UP_LESS_THAN_MAX\)
    sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
  ceph-deploy:
    conf:
      client:
        log file: /var/log/ceph/ceph-$name.$pid.log
      mon: {}
  install:
    ceph:
      flavor: default
      sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
      extra_system_packages:
        deb:
        - python3-xmltodict
        - python3-jmespath
        rpm:
        - bzip2
        - perl-Test-Harness
        - python3-xmltodict
        - python3-jmespath
  selinux:
    allowlist:
    - scontext=system_u:system_r:getty_t:s0
  workunit:
    branch: tt-squid
    sha1: 569c3e99c9b32a51b4eaf08731c728f4513ed589
owner: kyr
priority: 1000
repo: https://github.com/ceph/ceph.git
roles:
- - mon.a
  - mgr.x
  - osd.0
  - osd.1
  - osd.2
  - client.0
seed: 5909
sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
sleep_before_teardown: 0
suite: rados:standalone
suite_branch: tt-squid
suite_path: /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa
suite_relpath: qa
suite_repo: https://github.com/kshtsk/ceph.git
suite_sha1: 569c3e99c9b32a51b4eaf08731c728f4513ed589
targets:
  vm07.local: ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMXtiQKlPXHJBhGWwMl2/Us7grqrblTeIxXuetZ51BJhfyq9tPRIh9drUGzGkLYTTATIhIpJd1rPkMNXjkWi+rw=
tasks:
- install: null
- workunit:
    basedir: qa/standalone
    clients:
      all:
      - misc
teuthology:
  fragments_dropped: []
  meta: {}
  postmerge: []
teuthology_branch: clyso-debian-13
teuthology_repo: https://github.com/clyso/teuthology
teuthology_sha1: 1c580df7a9c7c2aadc272da296344fd99f27c444
timestamp: 2026-03-08_21:49:43
tube: vps
user: kyr
verbose: false
worker_log: /home/teuthos/.teuthology/dispatcher/dispatcher.vps.611473
2026-03-08T22:37:38.841 INFO:teuthology.run:suite_path is set to /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa; will attempt to use it
2026-03-08T22:37:38.841 INFO:teuthology.run:Found tasks at /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks
2026-03-08T22:37:38.841 INFO:teuthology.run_tasks:Running task internal.check_packages...
2026-03-08T22:37:38.842 INFO:teuthology.task.internal:Checking packages...
2026-03-08T22:37:38.842 INFO:teuthology.task.internal:Checking packages for os_type 'centos', flavor 'default' and ceph hash 'e911bdebe5c8faa3800735d1568fcdca65db60df'
2026-03-08T22:37:38.842 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch
2026-03-08T22:37:38.842 INFO:teuthology.packaging:ref: None
2026-03-08T22:37:38.842 INFO:teuthology.packaging:tag: None
2026-03-08T22:37:38.842 INFO:teuthology.packaging:branch: squid
2026-03-08T22:37:38.842 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-08T22:37:38.842 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&ref=squid
2026-03-08T22:37:39.585 INFO:teuthology.task.internal:Found packages for ceph version 19.2.3-678.ge911bdeb
2026-03-08T22:37:39.586 INFO:teuthology.run_tasks:Running task internal.buildpackages_prep...
2026-03-08T22:37:39.587 INFO:teuthology.task.internal:no buildpackages task found
2026-03-08T22:37:39.587 INFO:teuthology.run_tasks:Running task internal.save_config...
2026-03-08T22:37:39.588 INFO:teuthology.task.internal:Saving configuration
2026-03-08T22:37:39.592 INFO:teuthology.run_tasks:Running task internal.check_lock...
2026-03-08T22:37:39.593 INFO:teuthology.task.internal.check_lock:Checking locks...
2026-03-08T22:37:39.600 DEBUG:teuthology.task.internal.check_lock:machine status is {'name': 'vm07.local', 'description': '/archive/kyr-2026-03-08_21:49:43-rados:standalone-squid-none-default-vps/278', 'up': True, 'machine_type': 'vps', 'is_vm': True, 'vm_host': {'name': 'localhost', 'description': None, 'up': True, 'machine_type': 'libvirt', 'is_vm': False, 'vm_host': None, 'os_type': None, 'os_version': None, 'arch': None, 'locked': True, 'locked_since': None, 'locked_by': None, 'mac_address': None, 'ssh_pub_key': None}, 'os_type': 'centos', 'os_version': '9.stream', 'arch': 'x86_64', 'locked': True, 'locked_since': '2026-03-08 22:36:56.465056', 'locked_by': 'kyr', 'mac_address': '52:55:00:00:00:07', 'ssh_pub_key': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMXtiQKlPXHJBhGWwMl2/Us7grqrblTeIxXuetZ51BJhfyq9tPRIh9drUGzGkLYTTATIhIpJd1rPkMNXjkWi+rw='}
2026-03-08T22:37:39.600 INFO:teuthology.run_tasks:Running task internal.add_remotes...
2026-03-08T22:37:39.601 INFO:teuthology.task.internal:roles: ubuntu@vm07.local - ['mon.a', 'mgr.x', 'osd.0', 'osd.1', 'osd.2', 'client.0']
2026-03-08T22:37:39.601 INFO:teuthology.run_tasks:Running task console_log...
2026-03-08T22:37:39.608 DEBUG:teuthology.task.console_log:vm07 does not support IPMI; excluding
2026-03-08T22:37:39.608 DEBUG:teuthology.exit:Installing handler: Handler(exiter=, func=.kill_console_loggers at 0x7fbec2b00670>, signals=[15])
2026-03-08T22:37:39.608 INFO:teuthology.run_tasks:Running task internal.connect...
2026-03-08T22:37:39.609 INFO:teuthology.task.internal:Opening connections...
2026-03-08T22:37:39.609 DEBUG:teuthology.task.internal:connecting to ubuntu@vm07.local
2026-03-08T22:37:39.610 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm07.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-08T22:37:39.671 INFO:teuthology.run_tasks:Running task internal.push_inventory...
2026-03-08T22:37:39.672 DEBUG:teuthology.orchestra.run.vm07:> uname -m
2026-03-08T22:37:39.847 INFO:teuthology.orchestra.run.vm07.stdout:x86_64
2026-03-08T22:37:39.848 DEBUG:teuthology.orchestra.run.vm07:> cat /etc/os-release
2026-03-08T22:37:39.903 INFO:teuthology.orchestra.run.vm07.stdout:NAME="CentOS Stream"
2026-03-08T22:37:39.903 INFO:teuthology.orchestra.run.vm07.stdout:VERSION="9"
2026-03-08T22:37:39.903 INFO:teuthology.orchestra.run.vm07.stdout:ID="centos"
2026-03-08T22:37:39.903 INFO:teuthology.orchestra.run.vm07.stdout:ID_LIKE="rhel fedora"
2026-03-08T22:37:39.903 INFO:teuthology.orchestra.run.vm07.stdout:VERSION_ID="9"
2026-03-08T22:37:39.903 INFO:teuthology.orchestra.run.vm07.stdout:PLATFORM_ID="platform:el9"
2026-03-08T22:37:39.903 INFO:teuthology.orchestra.run.vm07.stdout:PRETTY_NAME="CentOS Stream 9"
2026-03-08T22:37:39.903 INFO:teuthology.orchestra.run.vm07.stdout:ANSI_COLOR="0;31"
2026-03-08T22:37:39.903 INFO:teuthology.orchestra.run.vm07.stdout:LOGO="fedora-logo-icon"
2026-03-08T22:37:39.903 INFO:teuthology.orchestra.run.vm07.stdout:CPE_NAME="cpe:/o:centos:centos:9"
2026-03-08T22:37:39.903 INFO:teuthology.orchestra.run.vm07.stdout:HOME_URL="https://centos.org/"
2026-03-08T22:37:39.903 INFO:teuthology.orchestra.run.vm07.stdout:BUG_REPORT_URL="https://issues.redhat.com/"
2026-03-08T22:37:39.903 INFO:teuthology.orchestra.run.vm07.stdout:REDHAT_SUPPORT_PRODUCT="Red Hat Enterprise Linux 9"
2026-03-08T22:37:39.903 INFO:teuthology.orchestra.run.vm07.stdout:REDHAT_SUPPORT_PRODUCT_VERSION="CentOS Stream"
2026-03-08T22:37:39.904 INFO:teuthology.lock.ops:Updating vm07.local on lock server
2026-03-08T22:37:39.909 INFO:teuthology.run_tasks:Running task internal.serialize_remote_roles...
2026-03-08T22:37:39.911 INFO:teuthology.run_tasks:Running task internal.check_conflict...
2026-03-08T22:37:39.912 INFO:teuthology.task.internal:Checking for old test directory...
2026-03-08T22:37:39.912 DEBUG:teuthology.orchestra.run.vm07:> test '!' -e /home/ubuntu/cephtest
2026-03-08T22:37:39.962 INFO:teuthology.run_tasks:Running task internal.check_ceph_data...
2026-03-08T22:37:39.963 INFO:teuthology.task.internal:Checking for non-empty /var/lib/ceph...
2026-03-08T22:37:39.963 DEBUG:teuthology.orchestra.run.vm07:> test -z $(ls -A /var/lib/ceph)
2026-03-08T22:37:40.020 INFO:teuthology.orchestra.run.vm07.stderr:ls: cannot access '/var/lib/ceph': No such file or directory
2026-03-08T22:37:40.020 INFO:teuthology.run_tasks:Running task internal.vm_setup...
2026-03-08T22:37:40.029 DEBUG:teuthology.orchestra.run.vm07:> test -e /ceph-qa-ready
2026-03-08T22:37:40.078 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-08T22:37:40.274 INFO:teuthology.run_tasks:Running task internal.base...
2026-03-08T22:37:40.275 INFO:teuthology.task.internal:Creating test directory...
2026-03-08T22:37:40.275 DEBUG:teuthology.orchestra.run.vm07:> mkdir -p -m0755 -- /home/ubuntu/cephtest
2026-03-08T22:37:40.290 INFO:teuthology.run_tasks:Running task internal.archive_upload...
2026-03-08T22:37:40.292 INFO:teuthology.run_tasks:Running task internal.archive...
2026-03-08T22:37:40.293 INFO:teuthology.task.internal:Creating archive directory...
2026-03-08T22:37:40.293 DEBUG:teuthology.orchestra.run.vm07:> install -d -m0755 -- /home/ubuntu/cephtest/archive
2026-03-08T22:37:40.350 INFO:teuthology.run_tasks:Running task internal.coredump...
2026-03-08T22:37:40.351 INFO:teuthology.task.internal:Enabling coredump saving...
2026-03-08T22:37:40.351 DEBUG:teuthology.orchestra.run.vm07:> test -f /run/.containerenv -o -f /.dockerenv
2026-03-08T22:37:40.404 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-08T22:37:40.404 DEBUG:teuthology.orchestra.run.vm07:> install -d -m0755 -- /home/ubuntu/cephtest/archive/coredump && sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core && echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf
2026-03-08T22:37:40.472 INFO:teuthology.orchestra.run.vm07.stdout:kernel.core_pattern = /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:37:40.484 INFO:teuthology.orchestra.run.vm07.stdout:kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:37:40.485 INFO:teuthology.run_tasks:Running task internal.sudo...
2026-03-08T22:37:40.487 INFO:teuthology.task.internal:Configuring sudo...
2026-03-08T22:37:40.487 DEBUG:teuthology.orchestra.run.vm07:> sudo sed -i.orig.teuthology -e 's/^\([^#]*\) \(requiretty\)/\1 !\2/g' -e 's/^\([^#]*\) !\(visiblepw\)/\1 \2/g' /etc/sudoers
2026-03-08T22:37:40.557 INFO:teuthology.run_tasks:Running task internal.syslog...
2026-03-08T22:37:40.559 INFO:teuthology.task.internal.syslog:Starting syslog monitoring...
2026-03-08T22:37:40.560 DEBUG:teuthology.orchestra.run.vm07:> mkdir -p -m0755 -- /home/ubuntu/cephtest/archive/syslog
2026-03-08T22:37:40.616 DEBUG:teuthology.orchestra.run.vm07:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-08T22:37:40.682 DEBUG:teuthology.orchestra.run.vm07:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-08T22:37:40.743 DEBUG:teuthology.orchestra.run.vm07:> set -ex
2026-03-08T22:37:40.743 DEBUG:teuthology.orchestra.run.vm07:> sudo dd of=/etc/rsyslog.d/80-cephtest.conf
2026-03-08T22:37:40.805 DEBUG:teuthology.orchestra.run.vm07:> sudo service rsyslog restart
2026-03-08T22:37:40.880 INFO:teuthology.orchestra.run.vm07.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-08T22:37:41.333 INFO:teuthology.run_tasks:Running task internal.timer...
2026-03-08T22:37:41.336 INFO:teuthology.task.internal:Starting timer...
2026-03-08T22:37:41.336 INFO:teuthology.run_tasks:Running task pcp...
2026-03-08T22:37:41.339 INFO:teuthology.run_tasks:Running task selinux...
2026-03-08T22:37:41.342 DEBUG:teuthology.task:Applying overrides for task selinux: {'allowlist': ['scontext=system_u:system_r:getty_t:s0']}
2026-03-08T22:37:41.342 INFO:teuthology.task.selinux:Excluding vm07: VMs are not yet supported
2026-03-08T22:37:41.342 DEBUG:teuthology.task.selinux:Getting current SELinux state
2026-03-08T22:37:41.342 DEBUG:teuthology.task.selinux:Existing SELinux modes: {}
2026-03-08T22:37:41.342 INFO:teuthology.task.selinux:Putting SELinux into permissive mode
2026-03-08T22:37:41.342 INFO:teuthology.run_tasks:Running task ansible.cephlab...
2026-03-08T22:37:41.345 DEBUG:teuthology.task:Applying overrides for task ansible.cephlab: {'branch': 'main', 'skip_tags': 'nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs', 'vars': {'timezone': 'UTC'}}
2026-03-08T22:37:41.345 DEBUG:teuthology.repo_utils:Resetting repo at /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main to origin/main
2026-03-08T22:37:41.351 INFO:teuthology.task.ansible:Playbook: [{'import_playbook': 'ansible_managed.yml'}, {'import_playbook': 'teuthology.yml'}, {'hosts': 'testnodes', 'tasks': [{'set_fact': {'ran_from_cephlab_playbook': True}}]}, {'import_playbook': 'testnodes.yml'}, {'import_playbook': 'container-host.yml'}, {'import_playbook': 'cobbler.yml'}, {'import_playbook': 'paddles.yml'}, {'import_playbook': 'pulpito.yml'}, {'hosts': 'testnodes', 'become': True, 'tasks': [{'name': 'Touch /ceph-qa-ready', 'file': {'path': '/ceph-qa-ready', 'state': 'touch'}, 'when': 'ran_from_cephlab_playbook|bool'}]}]
2026-03-08T22:37:41.351 DEBUG:teuthology.task.ansible:Running ansible-playbook -v --extra-vars '{"ansible_ssh_user": "ubuntu", "timezone": "UTC"}' -i /tmp/teuth_ansible_inventory5suoox5o --limit vm07.local /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main/cephlab.yml --skip-tags nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
2026-03-08T22:39:14.751 DEBUG:teuthology.task.ansible:Reconnecting to [Remote(name='ubuntu@vm07.local')]
2026-03-08T22:39:14.751 INFO:teuthology.orchestra.remote:Trying to reconnect to host 'ubuntu@vm07.local'
2026-03-08T22:39:14.751 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm07.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-08T22:39:14.819 DEBUG:teuthology.orchestra.run.vm07:> true
2026-03-08T22:39:14.891 INFO:teuthology.orchestra.remote:Successfully reconnected to host 'ubuntu@vm07.local'
2026-03-08T22:39:14.891 INFO:teuthology.run_tasks:Running task clock...
2026-03-08T22:39:14.894 INFO:teuthology.task.clock:Syncing clocks and checking initial clock skew...
2026-03-08T22:39:14.894 INFO:teuthology.orchestra.run:Running command with timeout 360
2026-03-08T22:39:14.894 DEBUG:teuthology.orchestra.run.vm07:> sudo systemctl stop ntp.service || sudo systemctl stop ntpd.service || sudo systemctl stop chronyd.service ; sudo ntpd -gq || sudo chronyc makestep ; sudo systemctl start ntp.service || sudo systemctl start ntpd.service || sudo systemctl start chronyd.service ; PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-08T22:39:14.974 INFO:teuthology.orchestra.run.vm07.stderr:Failed to stop ntp.service: Unit ntp.service not loaded.
2026-03-08T22:39:14.993 INFO:teuthology.orchestra.run.vm07.stderr:Failed to stop ntpd.service: Unit ntpd.service not loaded.
2026-03-08T22:39:15.026 INFO:teuthology.orchestra.run.vm07.stderr:sudo: ntpd: command not found
2026-03-08T22:39:15.039 INFO:teuthology.orchestra.run.vm07.stdout:506 Cannot talk to daemon
2026-03-08T22:39:15.060 INFO:teuthology.orchestra.run.vm07.stderr:Failed to start ntp.service: Unit ntp.service not found.
2026-03-08T22:39:15.086 INFO:teuthology.orchestra.run.vm07.stderr:Failed to start ntpd.service: Unit ntpd.service not found.
2026-03-08T22:39:15.134 INFO:teuthology.orchestra.run.vm07.stderr:bash: line 1: ntpq: command not found
2026-03-08T22:39:15.137 INFO:teuthology.orchestra.run.vm07.stdout:MS Name/IP address         Stratum Poll Reach LastRx Last sample
2026-03-08T22:39:15.137 INFO:teuthology.orchestra.run.vm07.stdout:===============================================================================
2026-03-08T22:39:15.138 INFO:teuthology.run_tasks:Running task install...
2026-03-08T22:39:15.140 DEBUG:teuthology.task.install:project ceph
2026-03-08T22:39:15.140 DEBUG:teuthology.task.install:INSTALL overrides: {'ceph': {'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df'}, 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}}
2026-03-08T22:39:15.140 DEBUG:teuthology.task.install:config {'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}}
2026-03-08T22:39:15.140 INFO:teuthology.task.install:Using flavor: default
2026-03-08T22:39:15.142 DEBUG:teuthology.task.install:Package list is: {'deb': ['ceph', 'cephadm', 'ceph-mds', 'ceph-mgr', 'ceph-common', 'ceph-fuse', 'ceph-test', 'ceph-volume', 'radosgw', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'libcephfs2', 'libcephfs-dev', 'librados2', 'librbd1', 'rbd-fuse'], 'rpm': ['ceph-radosgw', 'ceph-test', 'ceph', 'ceph-base', 'cephadm', 'ceph-immutable-object-cache', 'ceph-mgr', 'ceph-mgr-dashboard', 'ceph-mgr-diskprediction-local', 'ceph-mgr-rook', 'ceph-mgr-cephadm', 'ceph-fuse', 'ceph-volume', 'librados-devel', 'libcephfs2', 'libcephfs-devel', 'librados2', 'librbd1', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'rbd-fuse', 'rbd-mirror', 'rbd-nbd']}
2026-03-08T22:39:15.142 INFO:teuthology.task.install:extra packages: []
2026-03-08T22:39:15.142 DEBUG:teuthology.task.install.rpm:_update_package_list_and_install: config is {'branch': None, 'cleanup': None, 'debuginfo': None, 'downgrade_packages': [], 'exclude_packages': [], 'extra_packages': [], 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}, 'extras': None, 'enable_coprs': [], 'flavor': 'default', 'install_ceph_packages': True, 'packages': {}, 'project': 'ceph', 'repos_only': False, 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'tag': None, 'wait_for_package': False}
2026-03-08T22:39:15.143 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-08T22:39:15.786 INFO:teuthology.task.install.rpm:Pulling from https://3.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/centos/9/flavors/default/
2026-03-08T22:39:15.786 INFO:teuthology.task.install.rpm:Package version is 19.2.3-678.ge911bdeb
2026-03-08T22:39:16.341 INFO:teuthology.packaging:Writing yum repo:
[ceph]
name=ceph packages for $basearch
baseurl=https://3.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/centos/9/flavors/default/$basearch
enabled=1
gpgcheck=0
type=rpm-md

[ceph-noarch]
name=ceph noarch packages
baseurl=https://3.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/centos/9/flavors/default/noarch
enabled=1
gpgcheck=0
type=rpm-md

[ceph-source]
name=ceph source packages
baseurl=https://3.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/centos/9/flavors/default/SRPMS
enabled=1
gpgcheck=0
type=rpm-md
2026-03-08T22:39:16.341 DEBUG:teuthology.orchestra.run.vm07:> set -ex
2026-03-08T22:39:16.341 DEBUG:teuthology.orchestra.run.vm07:> sudo dd of=/etc/yum.repos.d/ceph.repo
2026-03-08T22:39:16.376 INFO:teuthology.task.install.rpm:Installing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, ceph-volume, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd, bzip2, perl-Test-Harness, python3-xmltodict, python3-jmespath on remote rpm x86_64
2026-03-08T22:39:16.376 DEBUG:teuthology.orchestra.run.vm07:> if test -f /etc/yum.repos.d/ceph.repo ; then sudo sed -i -e ':a;N;$!ba;s/enabled=1\ngpg/enabled=1\npriority=1\ngpg/g' -e 's;ref/[a-zA-Z0-9_-]*/;sha1/e911bdebe5c8faa3800735d1568fcdca65db60df/;g' /etc/yum.repos.d/ceph.repo ; fi
2026-03-08T22:39:16.445 DEBUG:teuthology.orchestra.run.vm07:> sudo touch -a /etc/yum/pluginconf.d/priorities.conf ; test -e /etc/yum/pluginconf.d/priorities.conf.orig || sudo cp -af /etc/yum/pluginconf.d/priorities.conf /etc/yum/pluginconf.d/priorities.conf.orig
2026-03-08T22:39:16.534 DEBUG:teuthology.orchestra.run.vm07:> grep check_obsoletes /etc/yum/pluginconf.d/priorities.conf && sudo sed -i 's/check_obsoletes.*0/check_obsoletes = 1/g' /etc/yum/pluginconf.d/priorities.conf || echo 'check_obsoletes = 1' | sudo tee -a /etc/yum/pluginconf.d/priorities.conf
2026-03-08T22:39:16.569 INFO:teuthology.orchestra.run.vm07.stdout:check_obsoletes = 1
2026-03-08T22:39:16.571 DEBUG:teuthology.orchestra.run.vm07:> sudo yum clean all
2026-03-08T22:39:16.802 INFO:teuthology.orchestra.run.vm07.stdout:41 files removed
2026-03-08T22:39:16.837 DEBUG:teuthology.orchestra.run.vm07:> sudo yum -y install ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse ceph-volume librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd bzip2 perl-Test-Harness python3-xmltodict python3-jmespath
2026-03-08T22:39:18.271 INFO:teuthology.orchestra.run.vm07.stdout:ceph packages for x86_64 68 kB/s | 84 kB 00:01
2026-03-08T22:39:19.257 INFO:teuthology.orchestra.run.vm07.stdout:ceph noarch packages 12 kB/s | 12 kB 00:00
2026-03-08T22:39:20.259 INFO:teuthology.orchestra.run.vm07.stdout:ceph source packages 1.9 kB/s | 1.9 kB 00:00
2026-03-08T22:39:21.563 INFO:teuthology.orchestra.run.vm07.stdout:CentOS Stream 9 - BaseOS 6.9 MB/s | 8.9 MB 00:01
2026-03-08T22:39:23.778 INFO:teuthology.orchestra.run.vm07.stdout:CentOS Stream 9 - AppStream 19 MB/s | 27 MB 00:01
2026-03-08T22:39:34.782 INFO:teuthology.orchestra.run.vm07.stdout:CentOS Stream 9 - CRB 1.0 MB/s | 8.0 MB 00:07
2026-03-08T22:39:36.010 INFO:teuthology.orchestra.run.vm07.stdout:CentOS Stream 9 - Extras packages 51 kB/s | 20 kB 00:00
2026-03-08T22:39:36.486 INFO:teuthology.orchestra.run.vm07.stdout:Extra Packages for Enterprise Linux 52 MB/s | 20 MB 00:00
2026-03-08T22:39:41.263 INFO:teuthology.orchestra.run.vm07.stdout:lab-extras 64 kB/s | 50 kB 00:00
2026-03-08T22:39:42.644 INFO:teuthology.orchestra.run.vm07.stdout:Package librados2-2:16.2.4-5.el9.x86_64 is already installed.
2026-03-08T22:39:42.644 INFO:teuthology.orchestra.run.vm07.stdout:Package librbd1-2:16.2.4-5.el9.x86_64 is already installed.
2026-03-08T22:39:42.649 INFO:teuthology.orchestra.run.vm07.stdout:Package bzip2-1.0.8-11.el9.x86_64 is already installed.
2026-03-08T22:39:42.649 INFO:teuthology.orchestra.run.vm07.stdout:Package perl-Test-Harness-1:3.42-461.el9.noarch is already installed.
2026-03-08T22:39:42.679 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-08T22:39:42.683 INFO:teuthology.orchestra.run.vm07.stdout:======================================================================================
2026-03-08T22:39:42.683 INFO:teuthology.orchestra.run.vm07.stdout: Package Arch Version Repository Size
2026-03-08T22:39:42.683 INFO:teuthology.orchestra.run.vm07.stdout:======================================================================================
2026-03-08T22:39:42.683 INFO:teuthology.orchestra.run.vm07.stdout:Installing:
2026-03-08T22:39:42.683 INFO:teuthology.orchestra.run.vm07.stdout: ceph x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 6.5 k
2026-03-08T22:39:42.683 INFO:teuthology.orchestra.run.vm07.stdout: ceph-base x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 5.5 M
2026-03-08T22:39:42.683 INFO:teuthology.orchestra.run.vm07.stdout: ceph-fuse x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 1.2 M
2026-03-08T22:39:42.683 INFO:teuthology.orchestra.run.vm07.stdout: ceph-immutable-object-cache x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 145 k
2026-03-08T22:39:42.683 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 1.1 M
2026-03-08T22:39:42.683 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-cephadm noarch 2:19.2.3-678.ge911bdeb.el9 ceph-noarch 150 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-dashboard noarch 2:19.2.3-678.ge911bdeb.el9 ceph-noarch 3.8 M
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-diskprediction-local noarch 2:19.2.3-678.ge911bdeb.el9 ceph-noarch 7.4 M
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-rook noarch 2:19.2.3-678.ge911bdeb.el9 ceph-noarch 49 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: ceph-radosgw x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 11 M
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: ceph-test x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 50 M
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: ceph-volume noarch 2:19.2.3-678.ge911bdeb.el9 ceph-noarch 299 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: cephadm noarch 2:19.2.3-678.ge911bdeb.el9 ceph-noarch 769 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: libcephfs-devel x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 34 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: libcephfs2 x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 1.0 M
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: librados-devel x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 127 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: python3-cephfs x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 165 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: python3-jmespath noarch 1.0.1-1.el9 appstream 48 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: python3-rados x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 323 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: python3-rbd x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 303 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: python3-rgw x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 100 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: python3-xmltodict noarch 0.12.0-15.el9 epel 22 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: rbd-fuse x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 85 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: rbd-mirror x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 3.1 M
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: rbd-nbd x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 171 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout:Upgrading:
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: librados2 x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 3.4 M
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: librbd1 x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 3.2 M
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout:Installing dependencies:
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: abseil-cpp x86_64 20211102.0-4.el9 epel 551 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: boost-program-options x86_64 1.75.0-13.el9 appstream 104 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: ceph-common x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 22 M
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: ceph-grafana-dashboards noarch 2:19.2.3-678.ge911bdeb.el9 ceph-noarch 31 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mds x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 2.4 M
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-modules-core noarch 2:19.2.3-678.ge911bdeb.el9 ceph-noarch 253 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mon x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 4.7 M
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: ceph-osd x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 17 M
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: ceph-prometheus-alerts noarch 2:19.2.3-678.ge911bdeb.el9 ceph-noarch 17 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: ceph-selinux x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 25 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: cryptsetup x86_64 2.8.1-3.el9 baseos 351 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas x86_64 3.0.4-9.el9 appstream 30 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 appstream 3.0 M
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 appstream 15 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: gperftools-libs x86_64 2.9.1-3.el9 epel 308 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: grpc-data noarch 1.46.7-10.el9 epel 19 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: ledmon-libs x86_64 1.1.0-3.el9 baseos 40 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: libarrow x86_64 9.0.0-15.el9 epel 4.4 M
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: libarrow-doc noarch 9.0.0-15.el9 epel 25 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: libcephsqlite x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 163 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: libconfig x86_64 1.7.2-9.el9 baseos 72 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: libgfortran x86_64 11.5.0-14.el9 baseos 794 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: libnbd x86_64 1.20.3-4.el9 appstream 164 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: liboath x86_64 2.6.12-1.el9 epel 49 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: libpmemobj x86_64 1.12.1-1.el9 appstream 160 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: libquadmath x86_64 11.5.0-14.el9 baseos 184 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: librabbitmq x86_64 0.11.0-7.el9 appstream 45 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: libradosstriper1 x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 503 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: librdkafka x86_64 1.6.1-102.el9 appstream 662 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: librgw2 x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 5.4 M
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: libstoragemgmt x86_64 1.10.1-1.el9 appstream 246 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: libunwind x86_64 1.6.2-1.el9 epel 67 k
2026-03-08T22:39:42.684 INFO:teuthology.orchestra.run.vm07.stdout: libxslt x86_64 1.1.34-12.el9 appstream 233 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: lttng-ust x86_64 2.12.0-6.el9 appstream 292 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: lua x86_64 5.4.4-4.el9 appstream 188 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: lua-devel x86_64 5.4.4-4.el9 crb 22 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: luarocks noarch 3.9.2-5.el9 epel 151 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: mailcap noarch 2.1.49-5.el9 baseos 33 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: openblas x86_64 0.3.29-1.el9 appstream 42 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: openblas-openmp x86_64 0.3.29-1.el9 appstream 5.3 M
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: parquet-libs x86_64 9.0.0-15.el9 epel 838 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: pciutils x86_64 3.7.0-7.el9 baseos 93 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: protobuf x86_64 3.14.0-17.el9 appstream 1.0 M
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: protobuf-compiler x86_64 3.14.0-17.el9 crb 862 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-asyncssh noarch 2.13.2-5.el9 epel 548 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-autocommand noarch 2.2.2-8.el9 epel 29 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-babel noarch 2.9.1-2.el9 appstream 6.0 M
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 epel 60 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-bcrypt x86_64 3.2.2-1.el9 epel 43 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-cachetools noarch 4.2.4-1.el9 epel 32 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-ceph-argparse x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 45 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-ceph-common x86_64 2:19.2.3-678.ge911bdeb.el9 ceph 142 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-certifi noarch 2023.05.07-4.el9 epel 14 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-cffi x86_64 1.14.5-5.el9 baseos 253 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-cheroot noarch 10.0.1-4.el9 epel 173 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-cherrypy noarch 18.6.1-2.el9 epel 358 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-cryptography x86_64 36.0.1-5.el9 baseos 1.2 M
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-devel x86_64 3.9.25-3.el9 appstream 244 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-google-auth noarch 1:2.45.0-1.el9 epel 254 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-grpcio x86_64 1.46.7-10.el9 epel 2.0 M
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-grpcio-tools x86_64 1.46.7-10.el9 epel 144 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco noarch 8.2.1-3.el9 epel 11 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 epel 18 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 epel 23 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-context noarch 6.0.1-3.el9 epel 20 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 epel 19 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-text noarch 4.0.0-2.el9 epel 26 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-jinja2 noarch 2.11.3-8.el9 appstream 249 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 epel 1.0 M
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 appstream 177 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-logutils noarch 0.3.5-21.el9 epel 46 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-mako noarch 1.1.4-6.el9 appstream 172 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-markupsafe x86_64 1.1.1-12.el9 appstream 35 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-more-itertools noarch 8.12.0-2.el9 epel 79 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-natsort noarch 7.1.1-5.el9 epel 58 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-numpy x86_64 1:1.23.5-2.el9 appstream 6.1 M
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 appstream 442 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-packaging noarch 20.9-5.el9 appstream 77 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-pecan noarch 1.4.2-3.el9 epel 272 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-ply noarch 3.11-14.el9 baseos 106 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-portend noarch 3.1.0-2.el9 epel 16 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-protobuf noarch 3.14.0-17.el9 appstream 267 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 epel 90 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyasn1 noarch 0.4.8-7.el9 appstream 157 k
2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: 
python3-pyasn1-modules noarch 0.4.8-7.el9 appstream 277 k 2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-pycparser noarch 2.20-6.el9 baseos 135 k 2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyparsing noarch 2.4.7-9.el9 baseos 150 k 2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-repoze-lru noarch 0.7-16.el9 epel 31 k 2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-requests noarch 2.25.1-10.el9 baseos 126 k 2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 appstream 54 k 2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-routes noarch 2.5.1-5.el9 epel 188 k 2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-rsa noarch 4.9-2.el9 epel 59 k 2026-03-08T22:39:42.685 INFO:teuthology.orchestra.run.vm07.stdout: python3-scipy x86_64 1.9.3-2.el9 appstream 19 M 2026-03-08T22:39:42.686 INFO:teuthology.orchestra.run.vm07.stdout: python3-tempora noarch 5.0.0-2.el9 epel 36 k 2026-03-08T22:39:42.686 INFO:teuthology.orchestra.run.vm07.stdout: python3-toml noarch 0.10.2-6.el9 appstream 42 k 2026-03-08T22:39:42.686 INFO:teuthology.orchestra.run.vm07.stdout: python3-typing-extensions noarch 4.15.0-1.el9 epel 86 k 2026-03-08T22:39:42.686 INFO:teuthology.orchestra.run.vm07.stdout: python3-urllib3 noarch 1.26.5-7.el9 baseos 218 k 2026-03-08T22:39:42.686 INFO:teuthology.orchestra.run.vm07.stdout: python3-webob noarch 1.8.8-2.el9 epel 230 k 2026-03-08T22:39:42.686 INFO:teuthology.orchestra.run.vm07.stdout: python3-websocket-client noarch 1.2.3-2.el9 epel 90 k 2026-03-08T22:39:42.686 INFO:teuthology.orchestra.run.vm07.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 epel 427 k 2026-03-08T22:39:42.686 INFO:teuthology.orchestra.run.vm07.stdout: python3-zc-lockfile noarch 2.0-10.el9 epel 20 k 2026-03-08T22:39:42.686 INFO:teuthology.orchestra.run.vm07.stdout: qatlib x86_64 
25.08.0-2.el9 appstream 240 k 2026-03-08T22:39:42.686 INFO:teuthology.orchestra.run.vm07.stdout: qatzip-libs x86_64 1.3.1-1.el9 appstream 66 k 2026-03-08T22:39:42.686 INFO:teuthology.orchestra.run.vm07.stdout: re2 x86_64 1:20211101-20.el9 epel 191 k 2026-03-08T22:39:42.686 INFO:teuthology.orchestra.run.vm07.stdout: socat x86_64 1.7.4.1-8.el9 appstream 303 k 2026-03-08T22:39:42.686 INFO:teuthology.orchestra.run.vm07.stdout: thrift x86_64 0.15.0-4.el9 epel 1.6 M 2026-03-08T22:39:42.686 INFO:teuthology.orchestra.run.vm07.stdout: unzip x86_64 6.0-59.el9 baseos 182 k 2026-03-08T22:39:42.686 INFO:teuthology.orchestra.run.vm07.stdout: xmlstarlet x86_64 1.6.1-20.el9 appstream 64 k 2026-03-08T22:39:42.686 INFO:teuthology.orchestra.run.vm07.stdout: zip x86_64 3.0-35.el9 baseos 266 k 2026-03-08T22:39:42.686 INFO:teuthology.orchestra.run.vm07.stdout:Installing weak dependencies: 2026-03-08T22:39:42.686 INFO:teuthology.orchestra.run.vm07.stdout: qatlib-service x86_64 25.08.0-2.el9 appstream 37 k 2026-03-08T22:39:42.686 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-08T22:39:42.686 INFO:teuthology.orchestra.run.vm07.stdout:Transaction Summary 2026-03-08T22:39:42.686 INFO:teuthology.orchestra.run.vm07.stdout:====================================================================================== 2026-03-08T22:39:42.686 INFO:teuthology.orchestra.run.vm07.stdout:Install 135 Packages 2026-03-08T22:39:42.686 INFO:teuthology.orchestra.run.vm07.stdout:Upgrade 2 Packages 2026-03-08T22:39:42.686 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-08T22:39:42.686 INFO:teuthology.orchestra.run.vm07.stdout:Total download size: 210 M 2026-03-08T22:39:42.686 INFO:teuthology.orchestra.run.vm07.stdout:Downloading Packages: 2026-03-08T22:39:44.180 INFO:teuthology.orchestra.run.vm07.stdout:(1/137): ceph-19.2.3-678.ge911bdeb.el9.x86_64.r 14 kB/s | 6.5 kB 00:00 2026-03-08T22:39:44.984 INFO:teuthology.orchestra.run.vm07.stdout:(2/137): ceph-fuse-19.2.3-678.ge911bdeb.el9.x86 1.4 MB/s | 1.2 MB 00:00 
2026-03-08T22:39:45.101 INFO:teuthology.orchestra.run.vm07.stdout:(3/137): ceph-immutable-object-cache-19.2.3-678 1.2 MB/s | 145 kB 00:00 2026-03-08T22:39:45.272 INFO:teuthology.orchestra.run.vm07.stdout:(4/137): ceph-base-19.2.3-678.ge911bdeb.el9.x86 3.5 MB/s | 5.5 MB 00:01 2026-03-08T22:39:45.351 INFO:teuthology.orchestra.run.vm07.stdout:(5/137): ceph-mds-19.2.3-678.ge911bdeb.el9.x86_ 9.7 MB/s | 2.4 MB 00:00 2026-03-08T22:39:45.405 INFO:teuthology.orchestra.run.vm07.stdout:(6/137): ceph-mgr-19.2.3-678.ge911bdeb.el9.x86_ 8.2 MB/s | 1.1 MB 00:00 2026-03-08T22:39:45.781 INFO:teuthology.orchestra.run.vm07.stdout:(7/137): ceph-mon-19.2.3-678.ge911bdeb.el9.x86_ 11 MB/s | 4.7 MB 00:00 2026-03-08T22:39:46.395 INFO:teuthology.orchestra.run.vm07.stdout:(8/137): ceph-common-19.2.3-678.ge911bdeb.el9.x 8.1 MB/s | 22 MB 00:02 2026-03-08T22:39:46.529 INFO:teuthology.orchestra.run.vm07.stdout:(9/137): ceph-selinux-19.2.3-678.ge911bdeb.el9. 187 kB/s | 25 kB 00:00 2026-03-08T22:39:46.671 INFO:teuthology.orchestra.run.vm07.stdout:(10/137): ceph-radosgw-19.2.3-678.ge911bdeb.el9 12 MB/s | 11 MB 00:00 2026-03-08T22:39:46.731 INFO:teuthology.orchestra.run.vm07.stdout:(11/137): ceph-osd-19.2.3-678.ge911bdeb.el9.x86 13 MB/s | 17 MB 00:01 2026-03-08T22:39:46.791 INFO:teuthology.orchestra.run.vm07.stdout:(12/137): libcephfs-devel-19.2.3-678.ge911bdeb. 
281 kB/s | 34 kB 00:00 2026-03-08T22:39:46.908 INFO:teuthology.orchestra.run.vm07.stdout:(13/137): libcephfs2-19.2.3-678.ge911bdeb.el9.x 5.5 MB/s | 1.0 MB 00:00 2026-03-08T22:39:46.910 INFO:teuthology.orchestra.run.vm07.stdout:(14/137): libcephsqlite-19.2.3-678.ge911bdeb.el 1.4 MB/s | 163 kB 00:00 2026-03-08T22:39:47.027 INFO:teuthology.orchestra.run.vm07.stdout:(15/137): librados-devel-19.2.3-678.ge911bdeb.e 1.0 MB/s | 127 kB 00:00 2026-03-08T22:39:47.035 INFO:teuthology.orchestra.run.vm07.stdout:(16/137): libradosstriper1-19.2.3-678.ge911bdeb 3.9 MB/s | 503 kB 00:00 2026-03-08T22:39:47.162 INFO:teuthology.orchestra.run.vm07.stdout:(17/137): python3-ceph-argparse-19.2.3-678.ge91 356 kB/s | 45 kB 00:00 2026-03-08T22:39:47.280 INFO:teuthology.orchestra.run.vm07.stdout:(18/137): python3-ceph-common-19.2.3-678.ge911b 1.2 MB/s | 142 kB 00:00 2026-03-08T22:39:47.396 INFO:teuthology.orchestra.run.vm07.stdout:(19/137): python3-cephfs-19.2.3-678.ge911bdeb.e 1.4 MB/s | 165 kB 00:00 2026-03-08T22:39:47.527 INFO:teuthology.orchestra.run.vm07.stdout:(20/137): librgw2-19.2.3-678.ge911bdeb.el9.x86_ 11 MB/s | 5.4 MB 00:00 2026-03-08T22:39:47.528 INFO:teuthology.orchestra.run.vm07.stdout:(21/137): python3-rados-19.2.3-678.ge911bdeb.el 2.4 MB/s | 323 kB 00:00 2026-03-08T22:39:47.643 INFO:teuthology.orchestra.run.vm07.stdout:(22/137): python3-rgw-19.2.3-678.ge911bdeb.el9. 864 kB/s | 100 kB 00:00 2026-03-08T22:39:47.647 INFO:teuthology.orchestra.run.vm07.stdout:(23/137): python3-rbd-19.2.3-678.ge911bdeb.el9. 
2.4 MB/s | 303 kB 00:00 2026-03-08T22:39:47.759 INFO:teuthology.orchestra.run.vm07.stdout:(24/137): rbd-fuse-19.2.3-678.ge911bdeb.el9.x86 737 kB/s | 85 kB 00:00 2026-03-08T22:39:47.919 INFO:teuthology.orchestra.run.vm07.stdout:(25/137): rbd-nbd-19.2.3-678.ge911bdeb.el9.x86_ 1.0 MB/s | 171 kB 00:00 2026-03-08T22:39:48.095 INFO:teuthology.orchestra.run.vm07.stdout:(26/137): rbd-mirror-19.2.3-678.ge911bdeb.el9.x 7.0 MB/s | 3.1 MB 00:00 2026-03-08T22:39:48.095 INFO:teuthology.orchestra.run.vm07.stdout:(27/137): ceph-grafana-dashboards-19.2.3-678.ge 177 kB/s | 31 kB 00:00 2026-03-08T22:39:48.218 INFO:teuthology.orchestra.run.vm07.stdout:(28/137): ceph-mgr-cephadm-19.2.3-678.ge911bdeb 1.2 MB/s | 150 kB 00:00 2026-03-08T22:39:48.452 INFO:teuthology.orchestra.run.vm07.stdout:(29/137): ceph-mgr-dashboard-19.2.3-678.ge911bd 11 MB/s | 3.8 MB 00:00 2026-03-08T22:39:48.569 INFO:teuthology.orchestra.run.vm07.stdout:(30/137): ceph-mgr-modules-core-19.2.3-678.ge91 2.1 MB/s | 253 kB 00:00 2026-03-08T22:39:48.685 INFO:teuthology.orchestra.run.vm07.stdout:(31/137): ceph-mgr-rook-19.2.3-678.ge911bdeb.el 427 kB/s | 49 kB 00:00 2026-03-08T22:39:48.800 INFO:teuthology.orchestra.run.vm07.stdout:(32/137): ceph-prometheus-alerts-19.2.3-678.ge9 146 kB/s | 17 kB 00:00 2026-03-08T22:39:48.919 INFO:teuthology.orchestra.run.vm07.stdout:(33/137): ceph-volume-19.2.3-678.ge911bdeb.el9. 
2.5 MB/s | 299 kB 00:00 2026-03-08T22:39:48.942 INFO:teuthology.orchestra.run.vm07.stdout:(34/137): ceph-mgr-diskprediction-local-19.2.3- 10 MB/s | 7.4 MB 00:00 2026-03-08T22:39:49.047 INFO:teuthology.orchestra.run.vm07.stdout:(35/137): cephadm-19.2.3-678.ge911bdeb.el9.noar 5.9 MB/s | 769 kB 00:00 2026-03-08T22:39:49.113 INFO:teuthology.orchestra.run.vm07.stdout:(36/137): cryptsetup-2.8.1-3.el9.x86_64.rpm 2.0 MB/s | 351 kB 00:00 2026-03-08T22:39:49.157 INFO:teuthology.orchestra.run.vm07.stdout:(37/137): libconfig-1.7.2-9.el9.x86_64.rpm 1.6 MB/s | 72 kB 00:00 2026-03-08T22:39:49.183 INFO:teuthology.orchestra.run.vm07.stdout:(38/137): ledmon-libs-1.1.0-3.el9.x86_64.rpm 299 kB/s | 40 kB 00:00 2026-03-08T22:39:49.249 INFO:teuthology.orchestra.run.vm07.stdout:(39/137): libgfortran-11.5.0-14.el9.x86_64.rpm 8.5 MB/s | 794 kB 00:00 2026-03-08T22:39:49.279 INFO:teuthology.orchestra.run.vm07.stdout:(40/137): mailcap-2.1.49-5.el9.noarch.rpm 1.1 MB/s | 33 kB 00:00 2026-03-08T22:39:49.281 INFO:teuthology.orchestra.run.vm07.stdout:(41/137): libquadmath-11.5.0-14.el9.x86_64.rpm 1.8 MB/s | 184 kB 00:00 2026-03-08T22:39:49.321 INFO:teuthology.orchestra.run.vm07.stdout:(42/137): pciutils-3.7.0-7.el9.x86_64.rpm 2.1 MB/s | 93 kB 00:00 2026-03-08T22:39:49.339 INFO:teuthology.orchestra.run.vm07.stdout:(43/137): python3-cffi-1.14.5-5.el9.x86_64.rpm 4.3 MB/s | 253 kB 00:00 2026-03-08T22:39:49.373 INFO:teuthology.orchestra.run.vm07.stdout:(44/137): python3-ply-3.11-14.el9.noarch.rpm 3.0 MB/s | 106 kB 00:00 2026-03-08T22:39:49.405 INFO:teuthology.orchestra.run.vm07.stdout:(45/137): python3-pycparser-2.20-6.el9.noarch.r 4.3 MB/s | 135 kB 00:00 2026-03-08T22:39:49.455 INFO:teuthology.orchestra.run.vm07.stdout:(46/137): python3-pyparsing-2.4.7-9.el9.noarch. 
2.9 MB/s | 150 kB 00:00 2026-03-08T22:39:49.469 INFO:teuthology.orchestra.run.vm07.stdout:(47/137): python3-cryptography-36.0.1-5.el9.x86 8.5 MB/s | 1.2 MB 00:00 2026-03-08T22:39:49.502 INFO:teuthology.orchestra.run.vm07.stdout:(48/137): python3-urllib3-1.26.5-7.el9.noarch.r 6.5 MB/s | 218 kB 00:00 2026-03-08T22:39:49.514 INFO:teuthology.orchestra.run.vm07.stdout:(49/137): python3-requests-2.25.1-10.el9.noarch 2.1 MB/s | 126 kB 00:00 2026-03-08T22:39:49.533 INFO:teuthology.orchestra.run.vm07.stdout:(50/137): unzip-6.0-59.el9.x86_64.rpm 5.8 MB/s | 182 kB 00:00 2026-03-08T22:39:49.583 INFO:teuthology.orchestra.run.vm07.stdout:(51/137): zip-3.0-35.el9.x86_64.rpm 3.8 MB/s | 266 kB 00:00 2026-03-08T22:39:49.829 INFO:teuthology.orchestra.run.vm07.stdout:(52/137): flexiblas-3.0.4-9.el9.x86_64.rpm 120 kB/s | 30 kB 00:00 2026-03-08T22:39:49.949 INFO:teuthology.orchestra.run.vm07.stdout:(53/137): boost-program-options-1.75.0-13.el9.x 250 kB/s | 104 kB 00:00 2026-03-08T22:39:50.006 INFO:teuthology.orchestra.run.vm07.stdout:(54/137): flexiblas-openblas-openmp-3.0.4-9.el9 263 kB/s | 15 kB 00:00 2026-03-08T22:39:50.226 INFO:teuthology.orchestra.run.vm07.stdout:(55/137): libnbd-1.20.3-4.el9.x86_64.rpm 745 kB/s | 164 kB 00:00 2026-03-08T22:39:50.412 INFO:teuthology.orchestra.run.vm07.stdout:(56/137): flexiblas-netlib-3.0.4-9.el9.x86_64.r 5.1 MB/s | 3.0 MB 00:00 2026-03-08T22:39:50.428 INFO:teuthology.orchestra.run.vm07.stdout:(57/137): libpmemobj-1.12.1-1.el9.x86_64.rpm 794 kB/s | 160 kB 00:00 2026-03-08T22:39:50.473 INFO:teuthology.orchestra.run.vm07.stdout:(58/137): librabbitmq-0.11.0-7.el9.x86_64.rpm 747 kB/s | 45 kB 00:00 2026-03-08T22:39:50.534 INFO:teuthology.orchestra.run.vm07.stdout:(59/137): libstoragemgmt-1.10.1-1.el9.x86_64.rp 4.0 MB/s | 246 kB 00:00 2026-03-08T22:39:50.544 INFO:teuthology.orchestra.run.vm07.stdout:(60/137): librdkafka-1.6.1-102.el9.x86_64.rpm 5.6 MB/s | 662 kB 00:00 2026-03-08T22:39:50.591 INFO:teuthology.orchestra.run.vm07.stdout:(61/137): 
libxslt-1.1.34-12.el9.x86_64.rpm 4.0 MB/s | 233 kB 00:00 2026-03-08T22:39:50.605 INFO:teuthology.orchestra.run.vm07.stdout:(62/137): lttng-ust-2.12.0-6.el9.x86_64.rpm 4.8 MB/s | 292 kB 00:00 2026-03-08T22:39:50.647 INFO:teuthology.orchestra.run.vm07.stdout:(63/137): lua-5.4.4-4.el9.x86_64.rpm 3.3 MB/s | 188 kB 00:00 2026-03-08T22:39:50.662 INFO:teuthology.orchestra.run.vm07.stdout:(64/137): openblas-0.3.29-1.el9.x86_64.rpm 744 kB/s | 42 kB 00:00 2026-03-08T22:39:50.943 INFO:teuthology.orchestra.run.vm07.stdout:(65/137): protobuf-3.14.0-17.el9.x86_64.rpm 3.6 MB/s | 1.0 MB 00:00 2026-03-08T22:39:50.989 INFO:teuthology.orchestra.run.vm07.stdout:(66/137): openblas-openmp-0.3.29-1.el9.x86_64.r 15 MB/s | 5.3 MB 00:00 2026-03-08T22:39:51.047 INFO:teuthology.orchestra.run.vm07.stdout:(67/137): python3-devel-3.9.25-3.el9.x86_64.rpm 4.2 MB/s | 244 kB 00:00 2026-03-08T22:39:51.104 INFO:teuthology.orchestra.run.vm07.stdout:(68/137): python3-jinja2-2.11.3-8.el9.noarch.rp 4.3 MB/s | 249 kB 00:00 2026-03-08T22:39:51.159 INFO:teuthology.orchestra.run.vm07.stdout:(69/137): python3-jmespath-1.0.1-1.el9.noarch.r 868 kB/s | 48 kB 00:00 2026-03-08T22:39:51.215 INFO:teuthology.orchestra.run.vm07.stdout:(70/137): python3-libstoragemgmt-1.10.1-1.el9.x 3.1 MB/s | 177 kB 00:00 2026-03-08T22:39:51.271 INFO:teuthology.orchestra.run.vm07.stdout:(71/137): python3-mako-1.1.4-6.el9.noarch.rpm 3.0 MB/s | 172 kB 00:00 2026-03-08T22:39:51.326 INFO:teuthology.orchestra.run.vm07.stdout:(72/137): python3-markupsafe-1.1.1-12.el9.x86_6 635 kB/s | 35 kB 00:00 2026-03-08T22:39:51.434 INFO:teuthology.orchestra.run.vm07.stdout:(73/137): python3-babel-2.9.1-2.el9.noarch.rpm 12 MB/s | 6.0 MB 00:00 2026-03-08T22:39:51.497 INFO:teuthology.orchestra.run.vm07.stdout:(74/137): python3-numpy-f2py-1.23.5-2.el9.x86_6 6.9 MB/s | 442 kB 00:00 2026-03-08T22:39:51.555 INFO:teuthology.orchestra.run.vm07.stdout:(75/137): python3-packaging-20.9-5.el9.noarch.r 1.3 MB/s | 77 kB 00:00 2026-03-08T22:39:51.616 
INFO:teuthology.orchestra.run.vm07.stdout:(76/137): python3-protobuf-3.14.0-17.el9.noarch 4.3 MB/s | 267 kB 00:00 2026-03-08T22:39:51.675 INFO:teuthology.orchestra.run.vm07.stdout:(77/137): python3-pyasn1-0.4.8-7.el9.noarch.rpm 2.6 MB/s | 157 kB 00:00 2026-03-08T22:39:51.717 INFO:teuthology.orchestra.run.vm07.stdout:(78/137): python3-numpy-1.23.5-2.el9.x86_64.rpm 16 MB/s | 6.1 MB 00:00 2026-03-08T22:39:51.735 INFO:teuthology.orchestra.run.vm07.stdout:(79/137): python3-pyasn1-modules-0.4.8-7.el9.no 4.5 MB/s | 277 kB 00:00 2026-03-08T22:39:51.772 INFO:teuthology.orchestra.run.vm07.stdout:(80/137): python3-requests-oauthlib-1.3.0-12.el 979 kB/s | 54 kB 00:00 2026-03-08T22:39:51.827 INFO:teuthology.orchestra.run.vm07.stdout:(81/137): python3-toml-0.10.2-6.el9.noarch.rpm 762 kB/s | 42 kB 00:00 2026-03-08T22:39:51.895 INFO:teuthology.orchestra.run.vm07.stdout:(82/137): qatlib-25.08.0-2.el9.x86_64.rpm 3.5 MB/s | 240 kB 00:00 2026-03-08T22:39:51.949 INFO:teuthology.orchestra.run.vm07.stdout:(83/137): qatlib-service-25.08.0-2.el9.x86_64.r 688 kB/s | 37 kB 00:00 2026-03-08T22:39:52.004 INFO:teuthology.orchestra.run.vm07.stdout:(84/137): qatzip-libs-1.3.1-1.el9.x86_64.rpm 1.2 MB/s | 66 kB 00:00 2026-03-08T22:39:52.061 INFO:teuthology.orchestra.run.vm07.stdout:(85/137): socat-1.7.4.1-8.el9.x86_64.rpm 5.2 MB/s | 303 kB 00:00 2026-03-08T22:39:52.116 INFO:teuthology.orchestra.run.vm07.stdout:(86/137): xmlstarlet-1.6.1-20.el9.x86_64.rpm 1.1 MB/s | 64 kB 00:00 2026-03-08T22:39:52.427 INFO:teuthology.orchestra.run.vm07.stdout:(87/137): ceph-test-19.2.3-678.ge911bdeb.el9.x8 8.4 MB/s | 50 MB 00:05 2026-03-08T22:39:52.556 INFO:teuthology.orchestra.run.vm07.stdout:(88/137): lua-devel-5.4.4-4.el9.x86_64.rpm 51 kB/s | 22 kB 00:00 2026-03-08T22:39:52.569 INFO:teuthology.orchestra.run.vm07.stdout:(89/137): abseil-cpp-20211102.0-4.el9.x86_64.rp 43 MB/s | 551 kB 00:00 2026-03-08T22:39:52.575 INFO:teuthology.orchestra.run.vm07.stdout:(90/137): gperftools-libs-2.9.1-3.el9.x86_64.rp 51 MB/s | 
308 kB 00:00 2026-03-08T22:39:52.577 INFO:teuthology.orchestra.run.vm07.stdout:(91/137): grpc-data-1.46.7-10.el9.noarch.rpm 8.4 MB/s | 19 kB 00:00 2026-03-08T22:39:52.643 INFO:teuthology.orchestra.run.vm07.stdout:(92/137): libarrow-9.0.0-15.el9.x86_64.rpm 68 MB/s | 4.4 MB 00:00 2026-03-08T22:39:52.646 INFO:teuthology.orchestra.run.vm07.stdout:(93/137): libarrow-doc-9.0.0-15.el9.noarch.rpm 9.5 MB/s | 25 kB 00:00 2026-03-08T22:39:52.648 INFO:teuthology.orchestra.run.vm07.stdout:(94/137): liboath-2.6.12-1.el9.x86_64.rpm 18 MB/s | 49 kB 00:00 2026-03-08T22:39:52.652 INFO:teuthology.orchestra.run.vm07.stdout:(95/137): libunwind-1.6.2-1.el9.x86_64.rpm 18 MB/s | 67 kB 00:00 2026-03-08T22:39:52.656 INFO:teuthology.orchestra.run.vm07.stdout:(96/137): luarocks-3.9.2-5.el9.noarch.rpm 37 MB/s | 151 kB 00:00 2026-03-08T22:39:52.669 INFO:teuthology.orchestra.run.vm07.stdout:(97/137): parquet-libs-9.0.0-15.el9.x86_64.rpm 67 MB/s | 838 kB 00:00 2026-03-08T22:39:52.680 INFO:teuthology.orchestra.run.vm07.stdout:(98/137): python3-asyncssh-2.13.2-5.el9.noarch. 50 MB/s | 548 kB 00:00 2026-03-08T22:39:52.682 INFO:teuthology.orchestra.run.vm07.stdout:(99/137): python3-autocommand-2.2.2-8.el9.noarc 12 MB/s | 29 kB 00:00 2026-03-08T22:39:52.685 INFO:teuthology.orchestra.run.vm07.stdout:(100/137): python3-backports-tarfile-1.2.0-1.el 22 MB/s | 60 kB 00:00 2026-03-08T22:39:52.688 INFO:teuthology.orchestra.run.vm07.stdout:(101/137): python3-bcrypt-3.2.2-1.el9.x86_64.rp 16 MB/s | 43 kB 00:00 2026-03-08T22:39:52.691 INFO:teuthology.orchestra.run.vm07.stdout:(102/137): python3-cachetools-4.2.4-1.el9.noarc 12 MB/s | 32 kB 00:00 2026-03-08T22:39:52.693 INFO:teuthology.orchestra.run.vm07.stdout:(103/137): python3-certifi-2023.05.07-4.el9.noa 5.9 MB/s | 14 kB 00:00 2026-03-08T22:39:52.698 INFO:teuthology.orchestra.run.vm07.stdout:(104/137): python3-cheroot-10.0.1-4.el9.noarch. 
39 MB/s | 173 kB 00:00 2026-03-08T22:39:52.704 INFO:teuthology.orchestra.run.vm07.stdout:(105/137): python3-cherrypy-18.6.1-2.el9.noarch 59 MB/s | 358 kB 00:00 2026-03-08T22:39:52.709 INFO:teuthology.orchestra.run.vm07.stdout:(106/137): python3-google-auth-2.45.0-1.el9.noa 52 MB/s | 254 kB 00:00 2026-03-08T22:39:52.742 INFO:teuthology.orchestra.run.vm07.stdout:(107/137): python3-grpcio-1.46.7-10.el9.x86_64. 61 MB/s | 2.0 MB 00:00 2026-03-08T22:39:52.746 INFO:teuthology.orchestra.run.vm07.stdout:(108/137): python3-grpcio-tools-1.46.7-10.el9.x 38 MB/s | 144 kB 00:00 2026-03-08T22:39:52.749 INFO:teuthology.orchestra.run.vm07.stdout:(109/137): python3-jaraco-8.2.1-3.el9.noarch.rp 4.5 MB/s | 11 kB 00:00 2026-03-08T22:39:52.752 INFO:teuthology.orchestra.run.vm07.stdout:(110/137): python3-jaraco-classes-3.2.1-5.el9.n 7.0 MB/s | 18 kB 00:00 2026-03-08T22:39:52.754 INFO:teuthology.orchestra.run.vm07.stdout:(111/137): python3-jaraco-collections-3.0.0-8.e 10 MB/s | 23 kB 00:00 2026-03-08T22:39:52.757 INFO:teuthology.orchestra.run.vm07.stdout:(112/137): python3-jaraco-context-6.0.1-3.el9.n 7.2 MB/s | 20 kB 00:00 2026-03-08T22:39:52.759 INFO:teuthology.orchestra.run.vm07.stdout:(113/137): python3-jaraco-functools-3.5.0-2.el9 8.4 MB/s | 19 kB 00:00 2026-03-08T22:39:52.762 INFO:teuthology.orchestra.run.vm07.stdout:(114/137): python3-jaraco-text-4.0.0-2.el9.noar 10 MB/s | 26 kB 00:00 2026-03-08T22:39:52.782 INFO:teuthology.orchestra.run.vm07.stdout:(115/137): python3-kubernetes-26.1.0-3.el9.noar 53 MB/s | 1.0 MB 00:00 2026-03-08T22:39:52.784 INFO:teuthology.orchestra.run.vm07.stdout:(116/137): python3-logutils-0.3.5-21.el9.noarch 19 MB/s | 46 kB 00:00 2026-03-08T22:39:52.787 INFO:teuthology.orchestra.run.vm07.stdout:(117/137): python3-more-itertools-8.12.0-2.el9. 
26 MB/s | 79 kB 00:00 2026-03-08T22:39:52.790 INFO:teuthology.orchestra.run.vm07.stdout:(118/137): python3-natsort-7.1.1-5.el9.noarch.r 22 MB/s | 58 kB 00:00 2026-03-08T22:39:52.796 INFO:teuthology.orchestra.run.vm07.stdout:(119/137): python3-pecan-1.4.2-3.el9.noarch.rpm 46 MB/s | 272 kB 00:00 2026-03-08T22:39:52.801 INFO:teuthology.orchestra.run.vm07.stdout:(120/137): python3-portend-3.1.0-2.el9.noarch.r 4.0 MB/s | 16 kB 00:00 2026-03-08T22:39:52.804 INFO:teuthology.orchestra.run.vm07.stdout:(121/137): python3-pyOpenSSL-21.0.0-1.el9.noarc 24 MB/s | 90 kB 00:00 2026-03-08T22:39:52.807 INFO:teuthology.orchestra.run.vm07.stdout:(122/137): python3-repoze-lru-0.7-16.el9.noarch 12 MB/s | 31 kB 00:00 2026-03-08T22:39:52.812 INFO:teuthology.orchestra.run.vm07.stdout:(123/137): python3-routes-2.5.1-5.el9.noarch.rp 43 MB/s | 188 kB 00:00 2026-03-08T22:39:52.815 INFO:teuthology.orchestra.run.vm07.stdout:(124/137): python3-rsa-4.9-2.el9.noarch.rpm 21 MB/s | 59 kB 00:00 2026-03-08T22:39:52.818 INFO:teuthology.orchestra.run.vm07.stdout:(125/137): python3-tempora-5.0.0-2.el9.noarch.r 12 MB/s | 36 kB 00:00 2026-03-08T22:39:52.822 INFO:teuthology.orchestra.run.vm07.stdout:(126/137): python3-typing-extensions-4.15.0-1.e 23 MB/s | 86 kB 00:00 2026-03-08T22:39:52.827 INFO:teuthology.orchestra.run.vm07.stdout:(127/137): python3-webob-1.8.8-2.el9.noarch.rpm 44 MB/s | 230 kB 00:00 2026-03-08T22:39:52.830 INFO:teuthology.orchestra.run.vm07.stdout:(128/137): python3-websocket-client-1.2.3-2.el9 29 MB/s | 90 kB 00:00 2026-03-08T22:39:52.838 INFO:teuthology.orchestra.run.vm07.stdout:(129/137): python3-werkzeug-2.0.3-3.el9.1.noarc 56 MB/s | 427 kB 00:00 2026-03-08T22:39:52.840 INFO:teuthology.orchestra.run.vm07.stdout:(130/137): python3-xmltodict-0.12.0-15.el9.noar 9.5 MB/s | 22 kB 00:00 2026-03-08T22:39:52.843 INFO:teuthology.orchestra.run.vm07.stdout:(131/137): python3-zc-lockfile-2.0-10.el9.noarc 7.7 MB/s | 20 kB 00:00 2026-03-08T22:39:52.849 
INFO:teuthology.orchestra.run.vm07.stdout:(132/137): re2-20211101-20.el9.x86_64.rpm 36 MB/s | 191 kB 00:00 2026-03-08T22:39:52.871 INFO:teuthology.orchestra.run.vm07.stdout:(133/137): thrift-0.15.0-4.el9.x86_64.rpm 72 MB/s | 1.6 MB 00:00 2026-03-08T22:39:53.065 INFO:teuthology.orchestra.run.vm07.stdout:(134/137): python3-scipy-1.9.3-2.el9.x86_64.rpm 14 MB/s | 19 MB 00:01 2026-03-08T22:39:53.187 INFO:teuthology.orchestra.run.vm07.stdout:(135/137): protobuf-compiler-3.14.0-17.el9.x86_ 1.1 MB/s | 862 kB 00:00 2026-03-08T22:39:53.811 INFO:teuthology.orchestra.run.vm07.stdout:(136/137): librados2-19.2.3-678.ge911bdeb.el9.x 3.7 MB/s | 3.4 MB 00:00 2026-03-08T22:39:54.016 INFO:teuthology.orchestra.run.vm07.stdout:(137/137): librbd1-19.2.3-678.ge911bdeb.el9.x86 3.3 MB/s | 3.2 MB 00:00 2026-03-08T22:39:54.020 INFO:teuthology.orchestra.run.vm07.stdout:-------------------------------------------------------------------------------- 2026-03-08T22:39:54.020 INFO:teuthology.orchestra.run.vm07.stdout:Total 19 MB/s | 210 MB 00:11 2026-03-08T22:39:54.665 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction check 2026-03-08T22:39:54.723 INFO:teuthology.orchestra.run.vm07.stdout:Transaction check succeeded. 2026-03-08T22:39:54.723 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction test 2026-03-08T22:39:55.666 INFO:teuthology.orchestra.run.vm07.stdout:Transaction test succeeded. 
2026-03-08T22:39:55.666 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction 2026-03-08T22:39:56.697 INFO:teuthology.orchestra.run.vm07.stdout: Preparing : 1/1 2026-03-08T22:39:56.721 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-more-itertools-8.12.0-2.el9.noarch 1/139 2026-03-08T22:39:56.736 INFO:teuthology.orchestra.run.vm07.stdout: Installing : thrift-0.15.0-4.el9.x86_64 2/139 2026-03-08T22:39:56.914 INFO:teuthology.orchestra.run.vm07.stdout: Installing : lttng-ust-2.12.0-6.el9.x86_64 3/139 2026-03-08T22:39:56.918 INFO:teuthology.orchestra.run.vm07.stdout: Upgrading : librados2-2:19.2.3-678.ge911bdeb.el9.x86_64 4/139 2026-03-08T22:39:56.985 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: librados2-2:19.2.3-678.ge911bdeb.el9.x86_64 4/139 2026-03-08T22:39:57.047 INFO:teuthology.orchestra.run.vm07.stdout: Installing : libcephfs2-2:19.2.3-678.ge911bdeb.el9.x86_64 5/139 2026-03-08T22:39:57.083 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: libcephfs2-2:19.2.3-678.ge911bdeb.el9.x86_64 5/139 2026-03-08T22:39:57.097 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-rados-2:19.2.3-678.ge911bdeb.el9.x86_64 6/139 2026-03-08T22:39:57.102 INFO:teuthology.orchestra.run.vm07.stdout: Installing : librdkafka-1.6.1-102.el9.x86_64 7/139 2026-03-08T22:39:57.105 INFO:teuthology.orchestra.run.vm07.stdout: Installing : librabbitmq-0.11.0-7.el9.x86_64 8/139 2026-03-08T22:39:57.111 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-jaraco-8.2.1-3.el9.noarch 9/139 2026-03-08T22:39:57.122 INFO:teuthology.orchestra.run.vm07.stdout: Installing : libnbd-1.20.3-4.el9.x86_64 10/139 2026-03-08T22:39:57.122 INFO:teuthology.orchestra.run.vm07.stdout: Installing : libcephsqlite-2:19.2.3-678.ge911bdeb.el9.x86_64 11/139 2026-03-08T22:39:57.161 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: libcephsqlite-2:19.2.3-678.ge911bdeb.el9.x86_64 11/139 2026-03-08T22:39:57.198 
INFO:teuthology.orchestra.run.vm07.stdout: Installing : libradosstriper1-2:19.2.3-678.ge911bdeb.el9.x86_ 12/139 2026-03-08T22:39:57.285 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: libradosstriper1-2:19.2.3-678.ge911bdeb.el9.x86_ 12/139 2026-03-08T22:39:57.325 INFO:teuthology.orchestra.run.vm07.stdout: Installing : re2-1:20211101-20.el9.x86_64 13/139 2026-03-08T22:39:57.380 INFO:teuthology.orchestra.run.vm07.stdout: Installing : libarrow-9.0.0-15.el9.x86_64 14/139 2026-03-08T22:39:57.385 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-werkzeug-2.0.3-3.el9.1.noarch 15/139 2026-03-08T22:39:57.413 INFO:teuthology.orchestra.run.vm07.stdout: Installing : liboath-2.6.12-1.el9.x86_64 16/139 2026-03-08T22:39:57.423 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-pyasn1-0.4.8-7.el9.noarch 17/139 2026-03-08T22:39:57.434 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-markupsafe-1.1.1-12.el9.x86_64 18/139 2026-03-08T22:39:57.441 INFO:teuthology.orchestra.run.vm07.stdout: Installing : protobuf-3.14.0-17.el9.x86_64 19/139 2026-03-08T22:39:57.445 INFO:teuthology.orchestra.run.vm07.stdout: Installing : lua-5.4.4-4.el9.x86_64 20/139 2026-03-08T22:39:57.451 INFO:teuthology.orchestra.run.vm07.stdout: Installing : flexiblas-3.0.4-9.el9.x86_64 21/139 2026-03-08T22:39:57.484 INFO:teuthology.orchestra.run.vm07.stdout: Installing : unzip-6.0-59.el9.x86_64 22/139 2026-03-08T22:39:57.505 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-urllib3-1.26.5-7.el9.noarch 23/139 2026-03-08T22:39:57.510 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-requests-2.25.1-10.el9.noarch 24/139 2026-03-08T22:39:57.529 INFO:teuthology.orchestra.run.vm07.stdout: Installing : libquadmath-11.5.0-14.el9.x86_64 25/139 2026-03-08T22:39:57.537 INFO:teuthology.orchestra.run.vm07.stdout: Installing : libgfortran-11.5.0-14.el9.x86_64 26/139 2026-03-08T22:39:57.574 INFO:teuthology.orchestra.run.vm07.stdout: Installing : 
ledmon-libs-1.1.0-3.el9.x86_64 27/139 2026-03-08T22:39:57.584 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-ceph-common-2:19.2.3-678.ge911bdeb.el9.x 28/139 2026-03-08T22:39:57.595 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-ceph-argparse-2:19.2.3-678.ge911bdeb.el9 29/139 2026-03-08T22:39:57.611 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-cephfs-2:19.2.3-678.ge911bdeb.el9.x86_64 30/139 2026-03-08T22:39:57.624 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-requests-oauthlib-1.3.0-12.el9.noarch 31/139 2026-03-08T22:39:57.660 INFO:teuthology.orchestra.run.vm07.stdout: Installing : zip-3.0-35.el9.x86_64 32/139 2026-03-08T22:39:57.667 INFO:teuthology.orchestra.run.vm07.stdout: Installing : luarocks-3.9.2-5.el9.noarch 33/139 2026-03-08T22:39:57.678 INFO:teuthology.orchestra.run.vm07.stdout: Installing : lua-devel-5.4.4-4.el9.x86_64 34/139 2026-03-08T22:39:57.712 INFO:teuthology.orchestra.run.vm07.stdout: Installing : protobuf-compiler-3.14.0-17.el9.x86_64 35/139 2026-03-08T22:39:57.783 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-mako-1.1.4-6.el9.noarch 36/139 2026-03-08T22:39:57.800 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-pyasn1-modules-0.4.8-7.el9.noarch 37/139 2026-03-08T22:39:57.812 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-rsa-4.9-2.el9.noarch 38/139 2026-03-08T22:39:57.825 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-jaraco-classes-3.2.1-5.el9.noarch 39/139 2026-03-08T22:39:57.836 INFO:teuthology.orchestra.run.vm07.stdout: Installing : librados-devel-2:19.2.3-678.ge911bdeb.el9.x86_64 40/139 2026-03-08T22:39:57.841 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-zc-lockfile-2.0-10.el9.noarch 41/139 2026-03-08T22:39:57.860 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-xmltodict-0.12.0-15.el9.noarch 42/139 2026-03-08T22:39:57.889 INFO:teuthology.orchestra.run.vm07.stdout: 
Installing : python3-websocket-client-1.2.3-2.el9.noarch 43/139 2026-03-08T22:39:57.897 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-webob-1.8.8-2.el9.noarch 44/139 2026-03-08T22:39:57.904 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-typing-extensions-4.15.0-1.el9.noarch 45/139 2026-03-08T22:39:57.963 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-repoze-lru-0.7-16.el9.noarch 46/139 2026-03-08T22:39:57.980 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-routes-2.5.1-5.el9.noarch 47/139 2026-03-08T22:39:57.992 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-natsort-7.1.1-5.el9.noarch 48/139 2026-03-08T22:39:58.061 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-logutils-0.3.5-21.el9.noarch 49/139 2026-03-08T22:39:58.069 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-pecan-1.4.2-3.el9.noarch 50/139 2026-03-08T22:39:58.112 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-certifi-2023.05.07-4.el9.noarch 51/139 2026-03-08T22:39:58.177 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-cachetools-4.2.4-1.el9.noarch 52/139 2026-03-08T22:39:58.605 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-google-auth-1:2.45.0-1.el9.noarch 53/139 2026-03-08T22:39:58.623 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-kubernetes-1:26.1.0-3.el9.noarch 54/139 2026-03-08T22:39:58.629 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-backports-tarfile-1.2.0-1.el9.noarch 55/139 2026-03-08T22:39:58.637 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-jaraco-context-6.0.1-3.el9.noarch 56/139 2026-03-08T22:39:58.642 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-autocommand-2.2.2-8.el9.noarch 57/139 2026-03-08T22:39:58.649 INFO:teuthology.orchestra.run.vm07.stdout: Installing : libunwind-1.6.2-1.el9.x86_64 58/139 2026-03-08T22:39:58.654 
INFO:teuthology.orchestra.run.vm07.stdout: Installing : gperftools-libs-2.9.1-3.el9.x86_64 59/139 2026-03-08T22:39:58.656 INFO:teuthology.orchestra.run.vm07.stdout: Installing : libarrow-doc-9.0.0-15.el9.noarch 60/139 2026-03-08T22:39:58.689 INFO:teuthology.orchestra.run.vm07.stdout: Installing : grpc-data-1.46.7-10.el9.noarch 61/139 2026-03-08T22:39:58.817 INFO:teuthology.orchestra.run.vm07.stdout: Installing : abseil-cpp-20211102.0-4.el9.x86_64 62/139 2026-03-08T22:39:58.832 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-grpcio-1.46.7-10.el9.x86_64 63/139 2026-03-08T22:39:58.840 INFO:teuthology.orchestra.run.vm07.stdout: Installing : socat-1.7.4.1-8.el9.x86_64 64/139 2026-03-08T22:39:58.847 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-toml-0.10.2-6.el9.noarch 65/139 2026-03-08T22:39:58.856 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-jaraco-functools-3.5.0-2.el9.noarch 66/139 2026-03-08T22:39:58.861 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-jaraco-text-4.0.0-2.el9.noarch 67/139 2026-03-08T22:39:58.872 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-jaraco-collections-3.0.0-8.el9.noarch 68/139 2026-03-08T22:39:58.878 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-tempora-5.0.0-2.el9.noarch 69/139 2026-03-08T22:39:58.920 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-portend-3.1.0-2.el9.noarch 70/139 2026-03-08T22:39:58.935 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-protobuf-3.14.0-17.el9.noarch 71/139 2026-03-08T22:39:58.981 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-grpcio-tools-1.46.7-10.el9.x86_64 72/139 2026-03-08T22:39:59.284 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-devel-3.9.25-3.el9.x86_64 73/139 2026-03-08T22:39:59.320 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-babel-2.9.1-2.el9.noarch 74/139 2026-03-08T22:39:59.326 
INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-jinja2-2.11.3-8.el9.noarch 75/139 2026-03-08T22:39:59.391 INFO:teuthology.orchestra.run.vm07.stdout: Installing : openblas-0.3.29-1.el9.x86_64 76/139 2026-03-08T22:39:59.393 INFO:teuthology.orchestra.run.vm07.stdout: Installing : openblas-openmp-0.3.29-1.el9.x86_64 77/139 2026-03-08T22:39:59.418 INFO:teuthology.orchestra.run.vm07.stdout: Installing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 78/139 2026-03-08T22:39:59.855 INFO:teuthology.orchestra.run.vm07.stdout: Installing : flexiblas-netlib-3.0.4-9.el9.x86_64 79/139 2026-03-08T22:39:59.948 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-numpy-1:1.23.5-2.el9.x86_64 80/139 2026-03-08T22:40:00.799 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 81/139 2026-03-08T22:40:00.830 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-scipy-1.9.3-2.el9.x86_64 82/139 2026-03-08T22:40:00.838 INFO:teuthology.orchestra.run.vm07.stdout: Installing : libxslt-1.1.34-12.el9.x86_64 83/139 2026-03-08T22:40:00.845 INFO:teuthology.orchestra.run.vm07.stdout: Installing : xmlstarlet-1.6.1-20.el9.x86_64 84/139 2026-03-08T22:40:01.023 INFO:teuthology.orchestra.run.vm07.stdout: Installing : libpmemobj-1.12.1-1.el9.x86_64 85/139 2026-03-08T22:40:01.026 INFO:teuthology.orchestra.run.vm07.stdout: Upgrading : librbd1-2:19.2.3-678.ge911bdeb.el9.x86_64 86/139 2026-03-08T22:40:01.059 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: librbd1-2:19.2.3-678.ge911bdeb.el9.x86_64 86/139 2026-03-08T22:40:01.063 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-rbd-2:19.2.3-678.ge911bdeb.el9.x86_64 87/139 2026-03-08T22:40:01.071 INFO:teuthology.orchestra.run.vm07.stdout: Installing : boost-program-options-1.75.0-13.el9.x86_64 88/139 2026-03-08T22:40:01.335 INFO:teuthology.orchestra.run.vm07.stdout: Installing : parquet-libs-9.0.0-15.el9.x86_64 89/139 2026-03-08T22:40:01.338 
INFO:teuthology.orchestra.run.vm07.stdout: Installing : librgw2-2:19.2.3-678.ge911bdeb.el9.x86_64 90/139 2026-03-08T22:40:01.360 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: librgw2-2:19.2.3-678.ge911bdeb.el9.x86_64 90/139 2026-03-08T22:40:01.362 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-rgw-2:19.2.3-678.ge911bdeb.el9.x86_64 91/139 2026-03-08T22:40:02.544 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64 92/139 2026-03-08T22:40:02.550 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64 92/139 2026-03-08T22:40:02.572 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64 92/139 2026-03-08T22:40:02.585 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-pyparsing-2.4.7-9.el9.noarch 93/139 2026-03-08T22:40:02.596 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-packaging-20.9-5.el9.noarch 94/139 2026-03-08T22:40:02.616 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-ply-3.11-14.el9.noarch 95/139 2026-03-08T22:40:02.637 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-pycparser-2.20-6.el9.noarch 96/139 2026-03-08T22:40:02.734 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-cffi-1.14.5-5.el9.x86_64 97/139 2026-03-08T22:40:02.750 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-cryptography-36.0.1-5.el9.x86_64 98/139 2026-03-08T22:40:02.779 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-pyOpenSSL-21.0.0-1.el9.noarch 99/139 2026-03-08T22:40:02.817 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-cheroot-10.0.1-4.el9.noarch 100/139 2026-03-08T22:40:02.886 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-cherrypy-18.6.1-2.el9.noarch 101/139 2026-03-08T22:40:02.897 INFO:teuthology.orchestra.run.vm07.stdout: Installing : 
python3-asyncssh-2.13.2-5.el9.noarch 102/139 2026-03-08T22:40:02.904 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-bcrypt-3.2.2-1.el9.x86_64 103/139 2026-03-08T22:40:02.912 INFO:teuthology.orchestra.run.vm07.stdout: Installing : pciutils-3.7.0-7.el9.x86_64 104/139 2026-03-08T22:40:02.919 INFO:teuthology.orchestra.run.vm07.stdout: Installing : qatlib-25.08.0-2.el9.x86_64 105/139 2026-03-08T22:40:02.921 INFO:teuthology.orchestra.run.vm07.stdout: Installing : qatlib-service-25.08.0-2.el9.x86_64 106/139 2026-03-08T22:40:02.941 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: qatlib-service-25.08.0-2.el9.x86_64 106/139 2026-03-08T22:40:03.266 INFO:teuthology.orchestra.run.vm07.stdout: Installing : qatzip-libs-1.3.1-1.el9.x86_64 107/139 2026-03-08T22:40:03.273 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-base-2:19.2.3-678.ge911bdeb.el9.x86_64 108/139 2026-03-08T22:40:03.324 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-base-2:19.2.3-678.ge911bdeb.el9.x86_64 108/139 2026-03-08T22:40:03.324 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /usr/lib/systemd/system/ceph.target. 2026-03-08T22:40:03.324 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-crash.service → /usr/lib/systemd/system/ceph-crash.service. 
2026-03-08T22:40:03.324 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-08T22:40:03.328 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-selinux-2:19.2.3-678.ge911bdeb.el9.x86_64 109/139 2026-03-08T22:40:10.566 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-selinux-2:19.2.3-678.ge911bdeb.el9.x86_64 109/139 2026-03-08T22:40:10.566 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /sys 2026-03-08T22:40:10.566 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /proc 2026-03-08T22:40:10.566 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /mnt 2026-03-08T22:40:10.566 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /var/tmp 2026-03-08T22:40:10.566 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /home 2026-03-08T22:40:10.566 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /root 2026-03-08T22:40:10.566 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /tmp 2026-03-08T22:40:10.566 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-08T22:40:10.702 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-mds-2:19.2.3-678.ge911bdeb.el9.x86_64 110/139 2026-03-08T22:40:10.726 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mds-2:19.2.3-678.ge911bdeb.el9.x86_64 110/139 2026-03-08T22:40:10.727 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-08T22:40:10.727 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service". 2026-03-08T22:40:10.727 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target. 2026-03-08T22:40:10.727 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target. 
2026-03-08T22:40:10.727 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-08T22:40:10.961 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-mon-2:19.2.3-678.ge911bdeb.el9.x86_64 111/139 2026-03-08T22:40:10.987 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mon-2:19.2.3-678.ge911bdeb.el9.x86_64 111/139 2026-03-08T22:40:10.987 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-08T22:40:10.987 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service". 2026-03-08T22:40:10.987 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target. 2026-03-08T22:40:10.987 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target. 2026-03-08T22:40:10.987 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-08T22:40:10.997 INFO:teuthology.orchestra.run.vm07.stdout: Installing : mailcap-2.1.49-5.el9.noarch 112/139 2026-03-08T22:40:11.000 INFO:teuthology.orchestra.run.vm07.stdout: Installing : libconfig-1.7.2-9.el9.x86_64 113/139 2026-03-08T22:40:11.022 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 114/139 2026-03-08T22:40:11.022 INFO:teuthology.orchestra.run.vm07.stdout:Creating group 'qat' with GID 994. 2026-03-08T22:40:11.022 INFO:teuthology.orchestra.run.vm07.stdout:Creating group 'libstoragemgmt' with GID 993. 2026-03-08T22:40:11.022 INFO:teuthology.orchestra.run.vm07.stdout:Creating user 'libstoragemgmt' (daemon account for libstoragemgmt) with UID 993 and GID 993. 
2026-03-08T22:40:11.022 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-08T22:40:11.035 INFO:teuthology.orchestra.run.vm07.stdout: Installing : libstoragemgmt-1.10.1-1.el9.x86_64 114/139 2026-03-08T22:40:11.064 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 114/139 2026-03-08T22:40:11.064 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/libstoragemgmt.service → /usr/lib/systemd/system/libstoragemgmt.service. 2026-03-08T22:40:11.064 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-08T22:40:11.107 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 115/139 2026-03-08T22:40:11.195 INFO:teuthology.orchestra.run.vm07.stdout: Installing : cryptsetup-2.8.1-3.el9.x86_64 116/139 2026-03-08T22:40:11.201 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-volume-2:19.2.3-678.ge911bdeb.el9.noarch 117/139 2026-03-08T22:40:11.215 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-volume-2:19.2.3-678.ge911bdeb.el9.noarch 117/139 2026-03-08T22:40:11.215 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-08T22:40:11.215 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-volume@*.service" escaped as "ceph-volume@\x2a.service". 2026-03-08T22:40:11.215 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-08T22:40:12.040 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-osd-2:19.2.3-678.ge911bdeb.el9.x86_64 118/139 2026-03-08T22:40:12.067 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-osd-2:19.2.3-678.ge911bdeb.el9.x86_64 118/139 2026-03-08T22:40:12.067 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this. 
2026-03-08T22:40:12.067 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service". 2026-03-08T22:40:12.067 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target. 2026-03-08T22:40:12.067 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target. 2026-03-08T22:40:12.067 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-08T22:40:12.129 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: cephadm-2:19.2.3-678.ge911bdeb.el9.noarch 119/139 2026-03-08T22:40:12.133 INFO:teuthology.orchestra.run.vm07.stdout: Installing : cephadm-2:19.2.3-678.ge911bdeb.el9.noarch 119/139 2026-03-08T22:40:12.141 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-prometheus-alerts-2:19.2.3-678.ge911bdeb.el 120/139 2026-03-08T22:40:12.167 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-grafana-dashboards-2:19.2.3-678.ge911bdeb.e 121/139 2026-03-08T22:40:12.170 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-mgr-cephadm-2:19.2.3-678.ge911bdeb.el9.noar 122/139 2026-03-08T22:40:12.741 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mgr-cephadm-2:19.2.3-678.ge911bdeb.el9.noar 122/139 2026-03-08T22:40:12.770 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-mgr-dashboard-2:19.2.3-678.ge911bdeb.el9.no 123/139 2026-03-08T22:40:13.344 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mgr-dashboard-2:19.2.3-678.ge911bdeb.el9.no 123/139 2026-03-08T22:40:13.383 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-mgr-diskprediction-local-2:19.2.3-678.ge911 124/139 2026-03-08T22:40:13.455 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:19.2.3-678.ge911 124/139 2026-03-08T22:40:13.673 
INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-mgr-modules-core-2:19.2.3-678.ge911bdeb.el9 125/139 2026-03-08T22:40:13.682 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-mgr-2:19.2.3-678.ge911bdeb.el9.x86_64 126/139 2026-03-08T22:40:13.707 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mgr-2:19.2.3-678.ge911bdeb.el9.x86_64 126/139 2026-03-08T22:40:13.707 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-08T22:40:13.707 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service". 2026-03-08T22:40:13.707 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target. 2026-03-08T22:40:13.707 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target. 2026-03-08T22:40:13.707 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-08T22:40:13.869 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-mgr-rook-2:19.2.3-678.ge911bdeb.el9.noarch 127/139 2026-03-08T22:40:13.883 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mgr-rook-2:19.2.3-678.ge911bdeb.el9.noarch 127/139 2026-03-08T22:40:14.497 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-2:19.2.3-678.ge911bdeb.el9.x86_64 128/139 2026-03-08T22:40:14.500 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-radosgw-2:19.2.3-678.ge911bdeb.el9.x86_64 129/139 2026-03-08T22:40:14.524 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-radosgw-2:19.2.3-678.ge911bdeb.el9.x86_64 129/139 2026-03-08T22:40:14.524 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this. 
2026-03-08T22:40:14.524 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service". 2026-03-08T22:40:14.524 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target. 2026-03-08T22:40:14.524 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target. 2026-03-08T22:40:14.524 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-08T22:40:14.536 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-immutable-object-cache-2:19.2.3-678.ge911bd 130/139 2026-03-08T22:40:14.558 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-immutable-object-cache-2:19.2.3-678.ge911bd 130/139 2026-03-08T22:40:14.558 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-08T22:40:14.558 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service". 2026-03-08T22:40:14.558 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-08T22:40:14.720 INFO:teuthology.orchestra.run.vm07.stdout: Installing : rbd-mirror-2:19.2.3-678.ge911bdeb.el9.x86_64 131/139 2026-03-08T22:40:14.745 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: rbd-mirror-2:19.2.3-678.ge911bdeb.el9.x86_64 131/139 2026-03-08T22:40:14.745 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-08T22:40:14.745 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service". 
2026-03-08T22:40:14.745 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target. 2026-03-08T22:40:14.745 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target. 2026-03-08T22:40:14.745 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-08T22:40:17.385 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-test-2:19.2.3-678.ge911bdeb.el9.x86_64 132/139 2026-03-08T22:40:17.396 INFO:teuthology.orchestra.run.vm07.stdout: Installing : rbd-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 133/139 2026-03-08T22:40:17.402 INFO:teuthology.orchestra.run.vm07.stdout: Installing : rbd-nbd-2:19.2.3-678.ge911bdeb.el9.x86_64 134/139 2026-03-08T22:40:17.458 INFO:teuthology.orchestra.run.vm07.stdout: Installing : libcephfs-devel-2:19.2.3-678.ge911bdeb.el9.x86_6 135/139 2026-03-08T22:40:17.468 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 136/139 2026-03-08T22:40:17.472 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-jmespath-1.0.1-1.el9.noarch 137/139 2026-03-08T22:40:17.473 INFO:teuthology.orchestra.run.vm07.stdout: Cleanup : librbd1-2:16.2.4-5.el9.x86_64 138/139 2026-03-08T22:40:17.490 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: librbd1-2:16.2.4-5.el9.x86_64 138/139 2026-03-08T22:40:17.490 INFO:teuthology.orchestra.run.vm07.stdout: Cleanup : librados2-2:16.2.4-5.el9.x86_64 139/139 2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: librados2-2:16.2.4-5.el9.x86_64 139/139 2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-2:19.2.3-678.ge911bdeb.el9.x86_64 1/139 2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-base-2:19.2.3-678.ge911bdeb.el9.x86_64 2/139 
2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64 3/139 2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 4/139 2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-immutable-object-cache-2:19.2.3-678.ge911bd 5/139 2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mds-2:19.2.3-678.ge911bdeb.el9.x86_64 6/139 2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-2:19.2.3-678.ge911bdeb.el9.x86_64 7/139 2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mon-2:19.2.3-678.ge911bdeb.el9.x86_64 8/139 2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-osd-2:19.2.3-678.ge911bdeb.el9.x86_64 9/139 2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-radosgw-2:19.2.3-678.ge911bdeb.el9.x86_64 10/139 2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-selinux-2:19.2.3-678.ge911bdeb.el9.x86_64 11/139 2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-test-2:19.2.3-678.ge911bdeb.el9.x86_64 12/139 2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libcephfs-devel-2:19.2.3-678.ge911bdeb.el9.x86_6 13/139 2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libcephfs2-2:19.2.3-678.ge911bdeb.el9.x86_64 14/139 2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libcephsqlite-2:19.2.3-678.ge911bdeb.el9.x86_64 15/139 2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librados-devel-2:19.2.3-678.ge911bdeb.el9.x86_64 16/139 2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libradosstriper1-2:19.2.3-678.ge911bdeb.el9.x86_ 17/139 2026-03-08T22:40:19.023 
INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librgw2-2:19.2.3-678.ge911bdeb.el9.x86_64 18/139 2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-ceph-argparse-2:19.2.3-678.ge911bdeb.el9 19/139 2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-ceph-common-2:19.2.3-678.ge911bdeb.el9.x 20/139 2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cephfs-2:19.2.3-678.ge911bdeb.el9.x86_64 21/139 2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-rados-2:19.2.3-678.ge911bdeb.el9.x86_64 22/139 2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-rbd-2:19.2.3-678.ge911bdeb.el9.x86_64 23/139 2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-rgw-2:19.2.3-678.ge911bdeb.el9.x86_64 24/139 2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : rbd-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 25/139 2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : rbd-mirror-2:19.2.3-678.ge911bdeb.el9.x86_64 26/139 2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : rbd-nbd-2:19.2.3-678.ge911bdeb.el9.x86_64 27/139 2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-grafana-dashboards-2:19.2.3-678.ge911bdeb.e 28/139 2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-cephadm-2:19.2.3-678.ge911bdeb.el9.noar 29/139 2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-dashboard-2:19.2.3-678.ge911bdeb.el9.no 30/139 2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-diskprediction-local-2:19.2.3-678.ge911 31/139 2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-modules-core-2:19.2.3-678.ge911bdeb.el9 32/139 2026-03-08T22:40:19.023 
INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-rook-2:19.2.3-678.ge911bdeb.el9.noarch 33/139 2026-03-08T22:40:19.023 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-prometheus-alerts-2:19.2.3-678.ge911bdeb.el 34/139 2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-volume-2:19.2.3-678.ge911bdeb.el9.noarch 35/139 2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : cephadm-2:19.2.3-678.ge911bdeb.el9.noarch 36/139 2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : cryptsetup-2.8.1-3.el9.x86_64 37/139 2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 38/139 2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 39/139 2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 40/139 2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 41/139 2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 42/139 2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : pciutils-3.7.0-7.el9.x86_64 43/139 2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 44/139 2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 45/139 2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-ply-3.11-14.el9.noarch 46/139 2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 47/139 2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pyparsing-2.4.7-9.el9.noarch 48/139 2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: 
Verifying : python3-requests-2.25.1-10.el9.noarch 49/139
2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 50/139
2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : unzip-6.0-59.el9.x86_64 51/139
2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : zip-3.0-35.el9.x86_64 52/139
2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 53/139
2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 54/139
2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 55/139
2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 56/139
2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libnbd-1.20.3-4.el9.x86_64 57/139
2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 58/139
2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 59/139
2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 60/139
2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 61/139
2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 62/139
2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 63/139
2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : lua-5.4.4-4.el9.x86_64 64/139
2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 65/139
2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 66/139
2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : protobuf-3.14.0-17.el9.x86_64 67/139
2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 68/139
2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 69/139
2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 70/139
2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jmespath-1.0.1-1.el9.noarch 71/139
2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 72/139
2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 73/139
2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 74/139
2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 75/139
2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 76/139
2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-packaging-20.9-5.el9.noarch 77/139
2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-protobuf-3.14.0-17.el9.noarch 78/139
2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 79/139
2026-03-08T22:40:19.026 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 80/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 81/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 82/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 83/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : qatlib-25.08.0-2.el9.x86_64 84/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : qatlib-service-25.08.0-2.el9.x86_64 85/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : qatzip-libs-1.3.1-1.el9.x86_64 86/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 87/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 88/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : lua-devel-5.4.4-4.el9.x86_64 89/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : protobuf-compiler-3.14.0-17.el9.x86_64 90/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : abseil-cpp-20211102.0-4.el9.x86_64 91/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 92/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : grpc-data-1.46.7-10.el9.noarch 93/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 94/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 95/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 96/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 97/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : luarocks-3.9.2-5.el9.noarch 98/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 99/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 100/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 101/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 102/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 103/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 104/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 105/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 106/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 107/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 108/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-grpcio-1.46.7-10.el9.x86_64 109/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-grpcio-tools-1.46.7-10.el9.x86_64 110/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 111/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 112/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 113/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 114/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 115/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 116/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 117/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 118/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 119/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 120/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 121/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 122/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 123/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 124/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 125/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 126/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 127/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 128/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 129/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 130/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 131/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-xmltodict-0.12.0-15.el9.noarch 132/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 133/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : re2-1:20211101-20.el9.x86_64 134/139
2026-03-08T22:40:19.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 135/139
2026-03-08T22:40:19.028 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librados2-2:19.2.3-678.ge911bdeb.el9.x86_64 136/139
2026-03-08T22:40:19.028 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librados2-2:16.2.4-5.el9.x86_64 137/139
2026-03-08T22:40:19.028 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librbd1-2:19.2.3-678.ge911bdeb.el9.x86_64 138/139
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librbd1-2:16.2.4-5.el9.x86_64 139/139
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout:Upgraded:
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: librados2-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: librbd1-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout:Installed:
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: abseil-cpp-20211102.0-4.el9.x86_64
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: boost-program-options-1.75.0-13.el9.x86_64
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: ceph-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: ceph-base-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: ceph-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: ceph-grafana-dashboards-2:19.2.3-678.ge911bdeb.el9.noarch
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: ceph-immutable-object-cache-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mds-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-cephadm-2:19.2.3-678.ge911bdeb.el9.noarch
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-dashboard-2:19.2.3-678.ge911bdeb.el9.noarch
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-diskprediction-local-2:19.2.3-678.ge911bdeb.el9.noarch
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-modules-core-2:19.2.3-678.ge911bdeb.el9.noarch
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-rook-2:19.2.3-678.ge911bdeb.el9.noarch
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mon-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: ceph-osd-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: ceph-prometheus-alerts-2:19.2.3-678.ge911bdeb.el9.noarch
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: ceph-radosgw-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: ceph-selinux-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: ceph-test-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: ceph-volume-2:19.2.3-678.ge911bdeb.el9.noarch
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: cephadm-2:19.2.3-678.ge911bdeb.el9.noarch
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: cryptsetup-2.8.1-3.el9.x86_64
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas-3.0.4-9.el9.x86_64
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: gperftools-libs-2.9.1-3.el9.x86_64
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: grpc-data-1.46.7-10.el9.noarch
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: ledmon-libs-1.1.0-3.el9.x86_64
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: libarrow-9.0.0-15.el9.x86_64
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: libarrow-doc-9.0.0-15.el9.noarch
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: libcephfs-devel-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: libcephfs2-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: libcephsqlite-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:40:19.135 INFO:teuthology.orchestra.run.vm07.stdout: libconfig-1.7.2-9.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: libgfortran-11.5.0-14.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: libnbd-1.20.3-4.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: liboath-2.6.12-1.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: libpmemobj-1.12.1-1.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: libquadmath-11.5.0-14.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: librabbitmq-0.11.0-7.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: librados-devel-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: libradosstriper1-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: librdkafka-1.6.1-102.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: librgw2-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: libunwind-1.6.2-1.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: libxslt-1.1.34-12.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: lttng-ust-2.12.0-6.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: lua-5.4.4-4.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: lua-devel-5.4.4-4.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: luarocks-3.9.2-5.el9.noarch
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: mailcap-2.1.49-5.el9.noarch
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: openblas-0.3.29-1.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: openblas-openmp-0.3.29-1.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: parquet-libs-9.0.0-15.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: pciutils-3.7.0-7.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: protobuf-3.14.0-17.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: protobuf-compiler-3.14.0-17.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-asyncssh-2.13.2-5.el9.noarch
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-autocommand-2.2.2-8.el9.noarch
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-babel-2.9.1-2.el9.noarch
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-bcrypt-3.2.2-1.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-cachetools-4.2.4-1.el9.noarch
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-ceph-argparse-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-cephfs-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-certifi-2023.05.07-4.el9.noarch
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-cffi-1.14.5-5.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-cheroot-10.0.1-4.el9.noarch
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-cherrypy-18.6.1-2.el9.noarch
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-cryptography-36.0.1-5.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-devel-3.9.25-3.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-google-auth-1:2.45.0-1.el9.noarch
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-grpcio-1.46.7-10.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-grpcio-tools-1.46.7-10.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-8.2.1-3.el9.noarch
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-context-6.0.1-3.el9.noarch
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-text-4.0.0-2.el9.noarch
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-jinja2-2.11.3-8.el9.noarch
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-jmespath-1.0.1-1.el9.noarch
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-logutils-0.3.5-21.el9.noarch
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-mako-1.1.4-6.el9.noarch
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-markupsafe-1.1.1-12.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-more-itertools-8.12.0-2.el9.noarch
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-natsort-7.1.1-5.el9.noarch
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-numpy-1:1.23.5-2.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-packaging-20.9-5.el9.noarch
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-pecan-1.4.2-3.el9.noarch
2026-03-08T22:40:19.136 INFO:teuthology.orchestra.run.vm07.stdout: python3-ply-3.11-14.el9.noarch
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: python3-portend-3.1.0-2.el9.noarch
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: python3-protobuf-3.14.0-17.el9.noarch
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyasn1-0.4.8-7.el9.noarch
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: python3-pycparser-2.20-6.el9.noarch
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyparsing-2.4.7-9.el9.noarch
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: python3-rados-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: python3-rbd-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: python3-repoze-lru-0.7-16.el9.noarch
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: python3-requests-2.25.1-10.el9.noarch
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: python3-rgw-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: python3-routes-2.5.1-5.el9.noarch
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: python3-rsa-4.9-2.el9.noarch
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: python3-scipy-1.9.3-2.el9.x86_64
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: python3-tempora-5.0.0-2.el9.noarch
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: python3-toml-0.10.2-6.el9.noarch
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: python3-typing-extensions-4.15.0-1.el9.noarch
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: python3-urllib3-1.26.5-7.el9.noarch
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: python3-webob-1.8.8-2.el9.noarch
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: python3-websocket-client-1.2.3-2.el9.noarch
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: python3-xmltodict-0.12.0-15.el9.noarch
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: python3-zc-lockfile-2.0-10.el9.noarch
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: qatlib-25.08.0-2.el9.x86_64
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: qatlib-service-25.08.0-2.el9.x86_64
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: qatzip-libs-1.3.1-1.el9.x86_64
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: rbd-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: rbd-mirror-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: rbd-nbd-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: re2-1:20211101-20.el9.x86_64
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: socat-1.7.4.1-8.el9.x86_64
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: thrift-0.15.0-4.el9.x86_64
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: unzip-6.0-59.el9.x86_64
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: xmlstarlet-1.6.1-20.el9.x86_64
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout: zip-3.0-35.el9.x86_64
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:40:19.137 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-08T22:40:19.234 DEBUG:teuthology.parallel:result is None
2026-03-08T22:40:19.234 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-08T22:40:19.839 DEBUG:teuthology.orchestra.run.vm07:> rpm -q ceph --qf '%{VERSION}-%{RELEASE}'
2026-03-08T22:40:19.861 INFO:teuthology.orchestra.run.vm07.stdout:19.2.3-678.ge911bdeb.el9
2026-03-08T22:40:19.861 INFO:teuthology.packaging:The installed version of ceph is 19.2.3-678.ge911bdeb.el9
2026-03-08T22:40:19.862 INFO:teuthology.task.install:The correct ceph version 19.2.3-678.ge911bdeb is installed.
2026-03-08T22:40:19.863 INFO:teuthology.task.install.util:Shipping valgrind.supp...
2026-03-08T22:40:19.863 DEBUG:teuthology.orchestra.run.vm07:> set -ex
2026-03-08T22:40:19.863 DEBUG:teuthology.orchestra.run.vm07:> sudo dd of=/home/ubuntu/cephtest/valgrind.supp
2026-03-08T22:40:19.931 INFO:teuthology.task.install.util:Shipping 'daemon-helper'...
2026-03-08T22:40:19.931 DEBUG:teuthology.orchestra.run.vm07:> set -ex
2026-03-08T22:40:19.931 DEBUG:teuthology.orchestra.run.vm07:> sudo dd of=/usr/bin/daemon-helper
2026-03-08T22:40:19.999 DEBUG:teuthology.orchestra.run.vm07:> sudo chmod a=rx -- /usr/bin/daemon-helper
2026-03-08T22:40:20.070 INFO:teuthology.task.install.util:Shipping 'adjust-ulimits'...
2026-03-08T22:40:20.070 DEBUG:teuthology.orchestra.run.vm07:> set -ex
2026-03-08T22:40:20.070 DEBUG:teuthology.orchestra.run.vm07:> sudo dd of=/usr/bin/adjust-ulimits
2026-03-08T22:40:20.137 DEBUG:teuthology.orchestra.run.vm07:> sudo chmod a=rx -- /usr/bin/adjust-ulimits
2026-03-08T22:40:20.205 INFO:teuthology.task.install.util:Shipping 'stdin-killer'...
2026-03-08T22:40:20.205 DEBUG:teuthology.orchestra.run.vm07:> set -ex 2026-03-08T22:40:20.205 DEBUG:teuthology.orchestra.run.vm07:> sudo dd of=/usr/bin/stdin-killer 2026-03-08T22:40:20.274 DEBUG:teuthology.orchestra.run.vm07:> sudo chmod a=rx -- /usr/bin/stdin-killer 2026-03-08T22:40:20.344 INFO:teuthology.run_tasks:Running task workunit... 2026-03-08T22:40:20.348 INFO:tasks.workunit:Pulling workunits from ref 569c3e99c9b32a51b4eaf08731c728f4513ed589 2026-03-08T22:40:20.348 INFO:tasks.workunit:Making a separate scratch dir for every client... 2026-03-08T22:40:20.348 INFO:tasks.workunit:timeout=3h 2026-03-08T22:40:20.348 INFO:tasks.workunit:cleanup=True 2026-03-08T22:40:20.348 DEBUG:teuthology.orchestra.run.vm07:> stat -- /home/ubuntu/cephtest/mnt.0 2026-03-08T22:40:20.402 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-08T22:40:20.402 INFO:teuthology.orchestra.run.vm07.stderr:stat: cannot statx '/home/ubuntu/cephtest/mnt.0': No such file or directory 2026-03-08T22:40:20.402 DEBUG:teuthology.orchestra.run.vm07:> mkdir -- /home/ubuntu/cephtest/mnt.0 2026-03-08T22:40:20.459 INFO:tasks.workunit:Created dir /home/ubuntu/cephtest/mnt.0 2026-03-08T22:40:20.459 DEBUG:teuthology.orchestra.run.vm07:> cd -- /home/ubuntu/cephtest/mnt.0 && mkdir -- client.0 2026-03-08T22:40:20.515 DEBUG:teuthology.orchestra.run.vm07:> rm -rf /home/ubuntu/cephtest/clone.client.0 && git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.0 && cd /home/ubuntu/cephtest/clone.client.0 && git checkout 569c3e99c9b32a51b4eaf08731c728f4513ed589 2026-03-08T22:40:20.572 INFO:tasks.workunit.client.0.vm07.stderr:Cloning into '/home/ubuntu/cephtest/clone.client.0'... 2026-03-08T22:41:36.269 INFO:tasks.workunit.client.0.vm07.stderr:Note: switching to '569c3e99c9b32a51b4eaf08731c728f4513ed589'. 2026-03-08T22:41:36.269 INFO:tasks.workunit.client.0.vm07.stderr: 2026-03-08T22:41:36.269 INFO:tasks.workunit.client.0.vm07.stderr:You are in 'detached HEAD' state. 
You can look around, make experimental 2026-03-08T22:41:36.269 INFO:tasks.workunit.client.0.vm07.stderr:changes and commit them, and you can discard any commits you make in this 2026-03-08T22:41:36.269 INFO:tasks.workunit.client.0.vm07.stderr:state without impacting any branches by switching back to a branch. 2026-03-08T22:41:36.269 INFO:tasks.workunit.client.0.vm07.stderr: 2026-03-08T22:41:36.269 INFO:tasks.workunit.client.0.vm07.stderr:If you want to create a new branch to retain commits you create, you may 2026-03-08T22:41:36.269 INFO:tasks.workunit.client.0.vm07.stderr:do so (now or later) by using -c with the switch command. Example: 2026-03-08T22:41:36.269 INFO:tasks.workunit.client.0.vm07.stderr: 2026-03-08T22:41:36.270 INFO:tasks.workunit.client.0.vm07.stderr: git switch -c 2026-03-08T22:41:36.270 INFO:tasks.workunit.client.0.vm07.stderr: 2026-03-08T22:41:36.270 INFO:tasks.workunit.client.0.vm07.stderr:Or undo this operation with: 2026-03-08T22:41:36.270 INFO:tasks.workunit.client.0.vm07.stderr: 2026-03-08T22:41:36.270 INFO:tasks.workunit.client.0.vm07.stderr: git switch - 2026-03-08T22:41:36.270 INFO:tasks.workunit.client.0.vm07.stderr: 2026-03-08T22:41:36.270 INFO:tasks.workunit.client.0.vm07.stderr:Turn off this advice by setting config variable advice.detachedHead to false 2026-03-08T22:41:36.270 INFO:tasks.workunit.client.0.vm07.stderr: 2026-03-08T22:41:36.270 INFO:tasks.workunit.client.0.vm07.stderr:HEAD is now at 569c3e99c9b qa/rgw: bucket notifications use pynose 2026-03-08T22:41:36.275 DEBUG:teuthology.orchestra.run.vm07:> cd -- /home/ubuntu/cephtest/clone.client.0/qa/standalone && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.0 2026-03-08T22:41:36.334 DEBUG:teuthology.orchestra.run.vm07:> set -ex 2026-03-08T22:41:36.334 DEBUG:teuthology.orchestra.run.vm07:> dd if=/home/ubuntu/cephtest/workunits.list.client.0 of=/dev/stdout 2026-03-08T22:41:36.392 
INFO:tasks.workunit:Running workunits matching misc on client.0... 2026-03-08T22:41:36.392 INFO:tasks.workunit:Running workunit misc/mclock-config.sh... 2026-03-08T22:41:36.392 DEBUG:teuthology.orchestra.run.vm07:workunit test misc/mclock-config.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=569c3e99c9b32a51b4eaf08731c728f4513ed589 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh 2026-03-08T22:41:36.455 INFO:tasks.workunit.client.0.vm07.stderr:stty: 'standard input': Inappropriate ioctl for device 2026-03-08T22:41:36.459 INFO:tasks.workunit.client.0.vm07.stderr:+ PS4='${BASH_SOURCE[0]}:$LINENO: ${FUNCNAME[0]}: ' 2026-03-08T22:41:36.459 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2370: main: export PATH=.:/home/ubuntu/.local/bin:/home/ubuntu/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/usr/sbin 2026-03-08T22:41:36.459 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2370: main: PATH=.:/home/ubuntu/.local/bin:/home/ubuntu/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/usr/sbin 2026-03-08T22:41:36.459 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2371: main: export PYTHONWARNINGS=ignore 2026-03-08T22:41:36.459 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2371: main: PYTHONWARNINGS=ignore 2026-03-08T22:41:36.459 
INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2372: main: export CEPH_CONF=/dev/null 2026-03-08T22:41:36.459 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2372: main: CEPH_CONF=/dev/null 2026-03-08T22:41:36.459 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2373: main: unset CEPH_ARGS 2026-03-08T22:41:36.459 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2375: main: local code 2026-03-08T22:41:36.459 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2376: main: run td/mclock-config 2026-03-08T22:41:36.459 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:21: run: local dir=td/mclock-config 2026-03-08T22:41:36.459 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:22: run: shift 2026-03-08T22:41:36.459 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:24: run: export CEPH_MON=127.0.0.1:7124 2026-03-08T22:41:36.459 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:24: run: CEPH_MON=127.0.0.1:7124 2026-03-08T22:41:36.459 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:25: run: export CEPH_ARGS 2026-03-08T22:41:36.459 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:26: run: uuidgen 2026-03-08T22:41:36.460 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:26: run: CEPH_ARGS+='--fsid=9ecd8d8d-2aa9-4009-8e34-696180900648 
--auth-supported=none ' 2026-03-08T22:41:36.460 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:27: run: CEPH_ARGS+='--mon-host=127.0.0.1:7124 ' 2026-03-08T22:41:36.460 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:28: run: CEPH_ARGS+='--debug-mclock 20 ' 2026-03-08T22:41:36.460 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:30: run: set 2026-03-08T22:41:36.460 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:30: run: sed -n -e 's/^\(TEST_[0-9a-z_]*\) .*/\1/p' 2026-03-08T22:41:36.462 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:30: run: local 'funcs=TEST_backfill_limit_adjustment_mclock 2026-03-08T22:41:36.462 INFO:tasks.workunit.client.0.vm07.stderr:TEST_profile_builtin_to_custom 2026-03-08T22:41:36.462 INFO:tasks.workunit.client.0.vm07.stderr:TEST_profile_custom_to_builtin 2026-03-08T22:41:36.462 INFO:tasks.workunit.client.0.vm07.stderr:TEST_profile_disallow_builtin_params_modify 2026-03-08T22:41:36.462 INFO:tasks.workunit.client.0.vm07.stderr:TEST_profile_disallow_builtin_params_override 2026-03-08T22:41:36.462 INFO:tasks.workunit.client.0.vm07.stderr:TEST_recovery_limit_adjustment_mclock' 2026-03-08T22:41:36.462 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:31: run: for func in $funcs 2026-03-08T22:41:36.462 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:32: run: setup td/mclock-config 2026-03-08T22:41:36.462 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/mclock-config 2026-03-08T22:41:36.462 
INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/mclock-config
2026-03-08T22:41:36.462 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/mclock-config
2026-03-08T22:41:36.462 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:41:36.462 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/mclock-config KILL
2026-03-08T22:41:36.462 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:41:36.462 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:41:36.462 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:41:36.462 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:41:36.462 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:41:36.463 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:41:36.464 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:41:36.464 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 
2026-03-08T22:41:36.464 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:41:36.465 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']'
2026-03-08T22:41:36.465 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:41:36.465 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:41:36.466 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:41:36.466 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:41:36.466 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:41:36.466 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:41:36.467 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:41:36.468 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:41:36.468 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/mclock-config
2026-03-08T22:41:36.469 
INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:41:36.469 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:36.469 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:41:36.469 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.48241
2026-03-08T22:41:36.470 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:41:36.470 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:41:36.470 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/mclock-config
2026-03-08T22:41:36.471 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T22:41:36.471 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:36.471 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:41:36.471 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.48241
2026-03-08T22:41:36.472 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 
2026-03-08T22:41:36.472 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 1024 -le 1024 ']'
2026-03-08T22:41:36.472 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:136: setup: ulimit -n 4096
2026-03-08T22:41:36.473 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T22:41:36.473 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/mclock-config 1' TERM HUP INT
2026-03-08T22:41:36.473 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:33: run: TEST_backfill_limit_adjustment_mclock td/mclock-config
2026-03-08T22:41:36.473 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:236: TEST_backfill_limit_adjustment_mclock: local dir=td/mclock-config
2026-03-08T22:41:36.473 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:238: TEST_backfill_limit_adjustment_mclock: setup td/mclock-config
2026-03-08T22:41:36.473 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/mclock-config
2026-03-08T22:41:36.473 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/mclock-config
2026-03-08T22:41:36.473 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/mclock-config
2026-03-08T22:41:36.473 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:41:36.473 
INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/mclock-config KILL
2026-03-08T22:41:36.473 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:41:36.473 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:41:36.473 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:41:36.473 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:41:36.473 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:41:36.475 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:41:36.475 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:41:36.476 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:41:36.476 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:41:36.477 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']'
2026-03-08T22:41:36.477 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:41:36.477 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:41:36.478 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:41:36.478 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:41:36.478 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:41:36.478 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:41:36.479 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:41:36.480 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:41:36.480 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/mclock-config
2026-03-08T22:41:36.481 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:41:36.481 
INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:36.481 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:41:36.481 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.48241
2026-03-08T22:41:36.482 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:41:36.482 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:41:36.483 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/mclock-config
2026-03-08T22:41:36.484 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T22:41:36.484 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:36.484 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:41:36.484 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.48241
2026-03-08T22:41:36.485 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n
2026-03-08T22:41:36.486 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 
2026-03-08T22:41:36.486 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T22:41:36.486 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/mclock-config 1' TERM HUP INT
2026-03-08T22:41:36.486 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:239: TEST_backfill_limit_adjustment_mclock: run_mon td/mclock-config a
2026-03-08T22:41:36.486 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/mclock-config
2026-03-08T22:41:36.486 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift
2026-03-08T22:41:36.486 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a
2026-03-08T22:41:36.486 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift
2026-03-08T22:41:36.486 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/mclock-config/a
2026-03-08T22:41:36.486 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/mclock-config/a --run-dir=td/mclock-config
2026-03-08T22:41:36.543 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path
2026-03-08T22:41:36.543 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:41:36.543 
INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:41:36.543 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:41:36.543 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:36.543 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:41:36.543 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.48241/$cluster-$name.asok'
2026-03-08T22:41:36.543 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/mclock-config/a '--log-file=td/mclock-config/$name.log' '--admin-socket=/tmp/ceph-asok.48241/$cluster-$name.asok' --mon-cluster-log-file=td/mclock-config/log --run-dir=td/mclock-config '--pid-file=td/mclock-config/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false
2026-03-08T22:41:36.576 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat
2026-03-08T22:41:36.576 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid
2026-03-08T22:41:36.576 
INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:41:36.576 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:41:36.576 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid
2026-03-08T22:41:36.577 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid
2026-03-08T22:41:36.577 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:41:36.577 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:41:36.577 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:41:36.577 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:41:36.578 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:36.578 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:41:36.578 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-mon.a.asok
2026-03-08T22:41:36.578 
INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:41:36.578 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.48241/ceph-mon.a.asok config get fsid
2026-03-08T22:41:36.640 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host
2026-03-08T22:41:36.640 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:41:36.640 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:41:36.640 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host
2026-03-08T22:41:36.640 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host
2026-03-08T22:41:36.640 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:41:36.640 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:41:36.640 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:41:36.640 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:41:36.640 
INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:36.640 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:41:36.640 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-mon.a.asok
2026-03-08T22:41:36.640 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:41:36.640 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.48241/ceph-mon.a.asok config get mon_host
2026-03-08T22:41:36.689 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:240: TEST_backfill_limit_adjustment_mclock: run_mgr td/mclock-config x
2026-03-08T22:41:36.689 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/mclock-config
2026-03-08T22:41:36.689 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift
2026-03-08T22:41:36.689 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x
2026-03-08T22:41:36.689 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift
2026-03-08T22:41:36.689 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/mclock-config/x
2026-03-08T22:41:36.689 
INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force
2026-03-08T22:41:36.807 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path
2026-03-08T22:41:36.807 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:41:36.807 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:41:36.807 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:41:36.807 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:36.807 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:41:36.807 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.48241/$cluster-$name.asok'
2026-03-08T22:41:36.807 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:41:36.808 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/mclock-config/x '--log-file=td/mclock-config/$name.log' '--admin-socket=/tmp/ceph-asok.48241/$cluster-$name.asok' 
--run-dir=td/mclock-config '--pid-file=td/mclock-config/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:41:36.829 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:242: TEST_backfill_limit_adjustment_mclock: run_osd td/mclock-config 0 --osd_op_queue=mclock_scheduler
2026-03-08T22:41:36.830 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/mclock-config
2026-03-08T22:41:36.830 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:41:36.830 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0
2026-03-08T22:41:36.830 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:41:36.830 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/mclock-config/0
2026-03-08T22:41:36.830 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=9ecd8d8d-2aa9-4009-8e34-696180900648 --auth-supported=none --mon-host=127.0.0.1:7124 --debug-mclock 20 '
2026-03-08T22:41:36.830 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:41:36.830 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:41:36.830 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' 
--osd-scrub-load-threshold=2000'
2026-03-08T22:41:36.830 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/mclock-config/0'
2026-03-08T22:41:36.830 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/mclock-config/0/journal'
2026-03-08T22:41:36.833 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:41:36.833 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:41:36.833 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/mclock-config'
2026-03-08T22:41:36.833 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:41:36.833 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:41:36.833 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:41:36.833 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:41:36.833 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:36.833 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:41:36.833 
INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.48241/$cluster-$name.asok'
2026-03-08T22:41:36.834 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.48241/$cluster-$name.asok'
2026-03-08T22:41:36.834 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:41:36.834 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:41:36.834 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:41:36.834 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/mclock-config/$name.log'
2026-03-08T22:41:36.834 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/mclock-config/$name.pid'
2026-03-08T22:41:36.834 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:41:36.834 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:41:36.834 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:41:36.834 
INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:41:36.834 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:41:36.834 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=--osd_op_queue=mclock_scheduler
2026-03-08T22:41:36.834 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/mclock-config/0
2026-03-08T22:41:36.835 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:41:36.836 INFO:tasks.workunit.client.0.vm07.stdout:add osd0 1e15f817-9a26-4d1a-bd24-45a6cd99b52b
2026-03-08T22:41:36.836 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=1e15f817-9a26-4d1a-bd24-45a6cd99b52b
2026-03-08T22:41:36.836 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 1e15f817-9a26-4d1a-bd24-45a6cd99b52b'
2026-03-08T22:41:36.836 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:41:36.849 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAg+61pYn60MhAAcvGfmOIUQKxWysE//uERhw==
2026-03-08T22:41:36.849 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAg+61pYn60MhAAcvGfmOIUQKxWysE//uERhw=="}'
2026-03-08T22:41:36.849 
INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 1e15f817-9a26-4d1a-bd24-45a6cd99b52b -i td/mclock-config/0/new.json 2026-03-08T22:41:36.979 INFO:tasks.workunit.client.0.vm07.stdout:0 2026-03-08T22:41:36.994 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/mclock-config/0/new.json 2026-03-08T22:41:36.995 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=9ecd8d8d-2aa9-4009-8e34-696180900648 --auth-supported=none --mon-host=127.0.0.1:7124 --debug-mclock 20 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/mclock-config/0 --osd-journal=td/mclock-config/0/journal --chdir= --run-dir=td/mclock-config '--admin-socket=/tmp/ceph-asok.48241/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/mclock-config/$name.log' '--pid-file=td/mclock-config/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_op_queue=mclock_scheduler --mkfs --key AQAg+61pYn60MhAAcvGfmOIUQKxWysE//uERhw== --osd-uuid 1e15f817-9a26-4d1a-bd24-45a6cd99b52b 2026-03-08T22:41:37.020 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:41:37.019+0000 7f76e881a780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:37.021 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:41:37.022+0000 7f76e881a780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:37.024 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:41:37.025+0000 7f76e881a780 -1 WARNING: all dangerous and experimental features are enabled. 
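The `run_osd` trace above shows how the helper accumulates a `ceph_args` string (debug levels, log/pid files, the mClock profile) and appends the per-test arguments last so they take precedence. A minimal pure-bash sketch of that pattern, not the actual `ceph-helpers.sh` code (`build_osd_args` is a hypothetical name, values illustrative):

```shell
#!/usr/bin/env bash
# Sketch of the run_osd argument assembly traced above: fixed defaults first,
# caller-supplied extras appended last so they override the defaults.
build_osd_args() {
    local dir=$1 id=$2
    shift 2
    local ceph_args=""
    ceph_args+=" --debug-osd=20"
    ceph_args+=" --debug-ms=1"
    # $name is expanded later by the daemon, so it is kept literal here.
    ceph_args+=" --log-file=$dir/\$name.log"
    ceph_args+=" --pid-file=$dir/\$name.pid"
    ceph_args+=" --osd-mclock-profile=high_recovery_ops"
    # Per-test extras (e.g. the scheduler selection) go last.
    ceph_args+=" $*"
    echo "$ceph_args"
}

build_osd_args td/mclock-config 0 --osd_op_queue=mclock_scheduler
```

Appending the extras last matters because Ceph daemons honour the last occurrence of a repeated option, which is how the test forces `mclock_scheduler` regardless of the defaults.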
2026-03-08T22:41:37.025 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:41:37.025+0000 7f76e881a780 -1 bdev(0x55eb2bdfa800 td/mclock-config/0/block) open stat got: (1) Operation not permitted
2026-03-08T22:41:37.025 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:41:37.025+0000 7f76e881a780 -1 bluestore(td/mclock-config/0) _read_fsid unparsable uuid
2026-03-08T22:41:39.158 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/mclock-config/0/keyring
2026-03-08T22:41:39.158 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:41:39.158 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository
2026-03-08T22:41:39.159 INFO:tasks.workunit.client.0.vm07.stdout:adding osd0 key to auth repository
2026-03-08T22:41:39.159 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/mclock-config/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:41:39.292 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0
2026-03-08T22:41:39.292 INFO:tasks.workunit.client.0.vm07.stdout:start osd.0
2026-03-08T22:41:39.292 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:41:39.293 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:41:39.295 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:41:39.299 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=9ecd8d8d-2aa9-4009-8e34-696180900648 --auth-supported=none --mon-host=127.0.0.1:7124 --debug-mclock 20 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/mclock-config/0 --osd-journal=td/mclock-config/0/journal --chdir= --run-dir=td/mclock-config '--admin-socket=/tmp/ceph-asok.48241/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/mclock-config/$name.log' '--pid-file=td/mclock-config/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_op_queue=mclock_scheduler
2026-03-08T22:41:39.344 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:41:39.341+0000 7f4bdb2ba780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:41:39.348 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:41:39.348+0000 7f4bdb2ba780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:41:39.357 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:41:39.358+0000 7f4bdb2ba780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:41:39.453 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0
2026-03-08T22:41:39.453 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:41:39.453 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T22:41:39.453 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:41:39.453 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:41:39.453 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:41:39.453 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:41:39.453 INFO:tasks.workunit.client.0.vm07.stdout:0
2026-03-08T22:41:39.453 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:41:39.454 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:41:39.588 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:41:40.589 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:41:40.589 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:41:40.589 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:41:40.590 INFO:tasks.workunit.client.0.vm07.stdout:1
2026-03-08T22:41:40.590 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:41:40.590 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:41:40.681 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:41:40.682+0000 7f4bdb2ba780 -1 Falling back to public interface
2026-03-08T22:41:40.807 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:41:41.545 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:41:41.546+0000 7f4bdb2ba780 -1 osd.0 0 log_to_monitors true
2026-03-08T22:41:41.808 INFO:tasks.workunit.client.0.vm07.stdout:2
2026-03-08T22:41:41.808 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:41:41.808 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:41:41.808 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:41:41.808 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:41:41.809 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:41:42.061 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:41:43.064 INFO:tasks.workunit.client.0.vm07.stdout:3
2026-03-08T22:41:43.064 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:41:43.064 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:41:43.064 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:41:43.064 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:41:43.064 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:41:43.306 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:41:44.308 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:41:44.308 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:41:44.308 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4
2026-03-08T22:41:44.308 INFO:tasks.workunit.client.0.vm07.stdout:4
2026-03-08T22:41:44.308 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:41:44.308 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:41:44.539 INFO:tasks.workunit.client.0.vm07.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/3546783394,v1:127.0.0.1:6803/3546783394] [v2:127.0.0.1:6804/3546783394,v1:127.0.0.1:6805/3546783394] exists,up 1e15f817-9a26-4d1a-bd24-45a6cd99b52b
2026-03-08T22:41:44.539 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:41:44.539 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:41:44.539 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:41:44.540 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:244: TEST_backfill_limit_adjustment_mclock: bc
2026-03-08T22:41:44.540 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:244: TEST_backfill_limit_adjustment_mclock: get_asok_path osd.0
2026-03-08T22:41:44.540 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0
2026-03-08T22:41:44.540 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']'
2026-03-08T22:41:44.541 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:41:44.541 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
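The `wait_for_osd` trace above is a straightforward poll-until-ready loop: up to 300 one-second attempts of `ceph osd dump | grep 'osd.0 up'`, breaking on the first hit. A self-contained sketch of the same pattern, with a hypothetical `wait_for_state` helper and a stub condition standing in for the cluster query:

```shell
#!/usr/bin/env bash
# Sketch of the wait_for_osd polling pattern: evaluate a check once per
# second, up to a retry budget, and report whether the state was reached.
wait_for_state() {
    local check_cmd=$1 tries=${2:-300}
    local i status=1
    for ((i = 0; i < tries; i++)); do
        if eval "$check_cmd"; then
            status=0
            break
        fi
        sleep 1
    done
    return $status
}

# Stub condition that succeeds on the third poll, standing in for
# `ceph osd dump | grep "osd.0 up"` in the real helper.
n=0
wait_for_state '(( ++n >= 3 ))' 10 && echo "state reached after $n polls"
```

The retry budget (300 here, as in the trace) bounds how long a slow OSD boot can stall the test before it fails outright.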
2026-03-08T22:41:44.541 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:41:44.541 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:244: TEST_backfill_limit_adjustment_mclock: jq .osd_max_backfills
2026-03-08T22:41:44.541 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok
2026-03-08T22:41:44.541 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:244: TEST_backfill_limit_adjustment_mclock: CEPH_ARGS=
2026-03-08T22:41:44.541 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:244: TEST_backfill_limit_adjustment_mclock: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok config get osd_max_backfills
2026-03-08T22:41:44.603 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:244: TEST_backfill_limit_adjustment_mclock: local backfills=1
2026-03-08T22:41:44.603 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:246: TEST_backfill_limit_adjustment_mclock: echo 'osd_max_backfills: 1'
2026-03-08T22:41:44.603 INFO:tasks.workunit.client.0.vm07.stdout:osd_max_backfills: 1
2026-03-08T22:41:44.603 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:251: TEST_backfill_limit_adjustment_mclock: ceph config set osd.0 osd_max_backfills 20
2026-03-08T22:41:44.834 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:252: TEST_backfill_limit_adjustment_mclock: sleep 2
2026-03-08T22:41:46.839 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:254: TEST_backfill_limit_adjustment_mclock: bc
2026-03-08T22:41:46.839 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:254: TEST_backfill_limit_adjustment_mclock: get_asok_path osd.0
2026-03-08T22:41:46.839 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0
2026-03-08T22:41:46.839 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']'
2026-03-08T22:41:46.839 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:41:46.839 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:46.839 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:41:46.839 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok
2026-03-08T22:41:46.839 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:254: TEST_backfill_limit_adjustment_mclock: jq .osd_max_backfills
2026-03-08T22:41:46.839 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:254: TEST_backfill_limit_adjustment_mclock: CEPH_ARGS=
2026-03-08T22:41:46.839 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:254: TEST_backfill_limit_adjustment_mclock: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok config get osd_max_backfills
2026-03-08T22:41:46.894 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:254: TEST_backfill_limit_adjustment_mclock: local max_backfills=1
2026-03-08T22:41:46.894 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:255: TEST_backfill_limit_adjustment_mclock: test 1 = 1
2026-03-08T22:41:46.895 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:259: TEST_backfill_limit_adjustment_mclock: bc
2026-03-08T22:41:46.895 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:259: TEST_backfill_limit_adjustment_mclock: get_asok_path osd.0
2026-03-08T22:41:46.895 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0
2026-03-08T22:41:46.895 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']'
2026-03-08T22:41:46.896 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:41:46.896 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:46.896 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:41:46.896 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok
2026-03-08T22:41:46.896 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:259: TEST_backfill_limit_adjustment_mclock: jq .local_reservations.max_allowed
2026-03-08T22:41:46.896 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:259: TEST_backfill_limit_adjustment_mclock: CEPH_ARGS=
2026-03-08T22:41:46.896 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:259: TEST_backfill_limit_adjustment_mclock: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok dump_recovery_reservations
2026-03-08T22:41:46.958 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:259: TEST_backfill_limit_adjustment_mclock: max_backfills=1
2026-03-08T22:41:46.958 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:260: TEST_backfill_limit_adjustment_mclock: test 1 = 1
2026-03-08T22:41:46.959 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:262: TEST_backfill_limit_adjustment_mclock: bc
2026-03-08T22:41:46.959 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:262: TEST_backfill_limit_adjustment_mclock: get_asok_path osd.0
2026-03-08T22:41:46.959 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0
2026-03-08T22:41:46.959 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']'
2026-03-08T22:41:46.959 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:41:46.959 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:46.960 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:41:46.960 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok
2026-03-08T22:41:46.960 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:262: TEST_backfill_limit_adjustment_mclock: jq .remote_reservations.max_allowed
2026-03-08T22:41:46.960 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:262: TEST_backfill_limit_adjustment_mclock: CEPH_ARGS=
2026-03-08T22:41:46.960 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:262: TEST_backfill_limit_adjustment_mclock: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok dump_recovery_reservations
2026-03-08T22:41:47.014 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:262: TEST_backfill_limit_adjustment_mclock: max_backfills=1
2026-03-08T22:41:47.014 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:263: TEST_backfill_limit_adjustment_mclock: test 1 = 1
2026-03-08T22:41:47.014 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:267: TEST_backfill_limit_adjustment_mclock: ceph config set osd.0 osd_mclock_override_recovery_settings true
2026-03-08T22:41:47.247 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:268: TEST_backfill_limit_adjustment_mclock: ceph config set osd.0 osd_max_backfills 20
2026-03-08T22:41:47.467 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:269: TEST_backfill_limit_adjustment_mclock: sleep 2
2026-03-08T22:41:49.469 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:271: TEST_backfill_limit_adjustment_mclock: bc
2026-03-08T22:41:49.469 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:271: TEST_backfill_limit_adjustment_mclock: get_asok_path osd.0
2026-03-08T22:41:49.469 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0
2026-03-08T22:41:49.469 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']'
2026-03-08T22:41:49.470 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:41:49.470 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:49.470 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:41:49.470 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:271: TEST_backfill_limit_adjustment_mclock: jq .osd_max_backfills
2026-03-08T22:41:49.470 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok
2026-03-08T22:41:49.470 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:271: TEST_backfill_limit_adjustment_mclock: CEPH_ARGS=
2026-03-08T22:41:49.470 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:271: TEST_backfill_limit_adjustment_mclock: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok config get osd_max_backfills
2026-03-08T22:41:49.531 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:271: TEST_backfill_limit_adjustment_mclock: max_backfills=20
2026-03-08T22:41:49.531 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:272: TEST_backfill_limit_adjustment_mclock: test 20 = 20
2026-03-08T22:41:49.531 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:276: TEST_backfill_limit_adjustment_mclock: bc
2026-03-08T22:41:49.531 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:276: TEST_backfill_limit_adjustment_mclock: get_asok_path osd.0
2026-03-08T22:41:49.532 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0
2026-03-08T22:41:49.532 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']'
2026-03-08T22:41:49.532 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:41:49.532 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:49.532 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:41:49.532 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok
2026-03-08T22:41:49.532 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:276: TEST_backfill_limit_adjustment_mclock: jq .local_reservations.max_allowed
2026-03-08T22:41:49.532 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:276: TEST_backfill_limit_adjustment_mclock: CEPH_ARGS=
2026-03-08T22:41:49.532 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:276: TEST_backfill_limit_adjustment_mclock: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok dump_recovery_reservations
2026-03-08T22:41:49.591 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:276: TEST_backfill_limit_adjustment_mclock: max_backfills=20
2026-03-08T22:41:49.591 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:277: TEST_backfill_limit_adjustment_mclock: test 20 = 20
2026-03-08T22:41:49.591 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:279: TEST_backfill_limit_adjustment_mclock: bc
2026-03-08T22:41:49.592 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:279: TEST_backfill_limit_adjustment_mclock: get_asok_path osd.0
2026-03-08T22:41:49.592 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0
2026-03-08T22:41:49.592 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']'
2026-03-08T22:41:49.592 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:41:49.592 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:41:49.592 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:41:49.592 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:279: TEST_backfill_limit_adjustment_mclock: jq .remote_reservations.max_allowed
2026-03-08T22:41:49.592 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok
2026-03-08T22:41:49.593 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:279: TEST_backfill_limit_adjustment_mclock: CEPH_ARGS=
2026-03-08T22:41:49.593 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:279: TEST_backfill_limit_adjustment_mclock: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok dump_recovery_reservations
2026-03-08T22:41:49.648 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:279: TEST_backfill_limit_adjustment_mclock: max_backfills=20
2026-03-08T22:41:49.648 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:280: TEST_backfill_limit_adjustment_mclock: test 20 = 20
2026-03-08T22:41:49.648 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:284: TEST_backfill_limit_adjustment_mclock: kill_daemons td/mclock-config TERM osd
2026-03-08T22:41:49.648 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:41:49.648 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:41:49.648 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:41:49.648 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:41:49.648 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:41:49.753 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:41:49.753 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:285: TEST_backfill_limit_adjustment_mclock: ceph osd down 0
2026-03-08T22:41:49.991 INFO:tasks.workunit.client.0.vm07.stderr:osd.0 is already down.
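The traced assertions above capture the mClock behaviour under test: with `osd_op_queue=mclock_scheduler`, setting `osd_max_backfills` to 20 has no effect (`test 1 = 1` still passes) until `osd_mclock_override_recovery_settings` is set to true, after which the new limit is honoured (`test 20 = 20`). A pure-bash illustration of that gating, not Ceph code (`set_max_backfills` and the variables are hypothetical stand-ins):

```shell
#!/usr/bin/env bash
# Illustration of the override gate seen in the trace: a requested
# osd_max_backfills value is ignored while the override flag is false.
override_recovery_settings=false
effective_max_backfills=1   # mClock's managed default, as in the log

set_max_backfills() {
    local requested=$1
    if [[ $override_recovery_settings == true ]]; then
        effective_max_backfills=$requested
    fi  # otherwise the scheduler keeps its own value
}

set_max_backfills 20
echo "$effective_max_backfills"   # still 1: override flag not set

override_recovery_settings=true
set_max_backfills 20
echo "$effective_max_backfills"   # now 20
```

This mirrors why the test sleeps and re-reads the value through the admin socket after each `ceph config set`: the observable effect, not the stored setting, is what changes once the override is enabled.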
2026-03-08T22:41:50.003 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:286: TEST_backfill_limit_adjustment_mclock: wait_for_osd down 0 2026-03-08T22:41:50.003 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=down 2026-03-08T22:41:50.003 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:41:50.003 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:41:50.003 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:41:50.003 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:50.003 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:41:50.003 INFO:tasks.workunit.client.0.vm07.stdout:0 2026-03-08T22:41:50.003 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:50.004 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 down' 2026-03-08T22:41:50.230 INFO:tasks.workunit.client.0.vm07.stdout:osd.0 down in weight 1 up_from 5 up_thru 0 down_at 6 last_clean_interval [0,0) [v2:127.0.0.1:6802/3546783394,v1:127.0.0.1:6803/3546783394] [v2:127.0.0.1:6804/3546783394,v1:127.0.0.1:6805/3546783394] exists 1e15f817-9a26-4d1a-bd24-45a6cd99b52b 2026-03-08T22:41:50.230 
INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:41:50.230 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:41:50.230 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:41:50.230 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:287: TEST_backfill_limit_adjustment_mclock: activate_osd td/mclock-config 0 --osd-op-queue=mclock_scheduler 2026-03-08T22:41:50.231 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/mclock-config 2026-03-08T22:41:50.231 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:41:50.231 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T22:41:50.231 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:41:50.231 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/mclock-config/0 2026-03-08T22:41:50.231 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=9ecd8d8d-2aa9-4009-8e34-696180900648 --auth-supported=none --mon-host=127.0.0.1:7124 --debug-mclock 20 ' 2026-03-08T22:41:50.231 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' 
--osd-failsafe-full-ratio=.99' 2026-03-08T22:41:50.231 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:41:50.231 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:41:50.231 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/mclock-config/0' 2026-03-08T22:41:50.231 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/mclock-config/0/journal' 2026-03-08T22:41:50.231 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:41:50.231 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:41:50.231 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/mclock-config' 2026-03-08T22:41:50.231 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:41:50.231 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:41:50.231 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:41:50.231 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: 
get_asok_dir 2026-03-08T22:41:50.231 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:50.231 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:41:50.231 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.48241/$cluster-$name.asok' 2026-03-08T22:41:50.231 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.48241/$cluster-$name.asok' 2026-03-08T22:41:50.231 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:41:50.231 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/mclock-config/$name.log' 2026-03-08T22:41:50.232 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/mclock-config/$name.pid' 2026-03-08T22:41:50.232 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:41:50.232 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:41:50.232 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 
2026-03-08T22:41:50.232 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:41:50.232 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:41:50.232 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+=--osd-op-queue=mclock_scheduler 2026-03-08T22:41:50.232 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/mclock-config/0 2026-03-08T22:41:50.232 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T22:41:50.232 INFO:tasks.workunit.client.0.vm07.stdout:start osd.0 2026-03-08T22:41:50.233 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=9ecd8d8d-2aa9-4009-8e34-696180900648 --auth-supported=none --mon-host=127.0.0.1:7124 --debug-mclock 20 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/mclock-config/0 --osd-journal=td/mclock-config/0/journal --chdir= --run-dir=td/mclock-config '--admin-socket=/tmp/ceph-asok.48241/$cluster-$name.asok' --debug-osd=20 '--log-file=td/mclock-config/$name.log' '--pid-file=td/mclock-config/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-op-queue=mclock_scheduler 2026-03-08T22:41:50.233 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/mclock-config/0/whoami 
2026-03-08T22:41:50.234 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T22:41:50.234 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:41:50.235 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:41:50.237 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:41:50.251 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:41:50.251+0000 7fd045e9d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:50.256 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:41:50.257+0000 7fd045e9d780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:50.258 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:41:50.258+0000 7fd045e9d780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:41:50.462 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T22:41:50.463 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:41:50.463 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:41:50.463 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:41:50.463 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:41:50.463 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:50.463 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:41:50.463 INFO:tasks.workunit.client.0.vm07.stdout:0 2026-03-08T22:41:50.463 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:50.463 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:41:50.681 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:41:50.814 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:41:50.814+0000 7fd045e9d780 -1 Falling back to public interface 2026-03-08T22:41:51.681 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:41:51.682+0000 7fd045e9d780 -1 osd.0 6 log_to_monitors true 
2026-03-08T22:41:51.684 INFO:tasks.workunit.client.0.vm07.stdout:1 2026-03-08T22:41:51.684 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:41:51.684 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:51.684 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:41:51.684 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:51.684 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:41:51.908 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:41:52.910 INFO:tasks.workunit.client.0.vm07.stdout:2 2026-03-08T22:41:52.913 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:41:52.913 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:52.913 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:41:52.913 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:52.913 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:41:53.124 
INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:41:54.126 INFO:tasks.workunit.client.0.vm07.stdout:3 2026-03-08T22:41:54.127 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:41:54.127 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:41:54.127 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:41:54.127 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:41:54.127 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:41:54.345 INFO:tasks.workunit.client.0.vm07.stdout:osd.0 up in weight 1 up_from 8 up_thru 0 down_at 6 last_clean_interval [5,6) [v2:127.0.0.1:6802/2031056314,v1:127.0.0.1:6803/2031056314] [v2:127.0.0.1:6804/2031056314,v1:127.0.0.1:6805/2031056314] exists,up 1e15f817-9a26-4d1a-bd24-45a6cd99b52b 2026-03-08T22:41:54.345 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:41:54.345 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:41:54.345 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:41:54.346 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:290: TEST_backfill_limit_adjustment_mclock: bc 
2026-03-08T22:41:54.346 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:290: TEST_backfill_limit_adjustment_mclock: get_asok_path osd.0 2026-03-08T22:41:54.346 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T22:41:54.346 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T22:41:54.346 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:41:54.346 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:54.347 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:41:54.347 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok 2026-03-08T22:41:54.347 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:290: TEST_backfill_limit_adjustment_mclock: jq .osd_max_backfills 2026-03-08T22:41:54.347 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:290: TEST_backfill_limit_adjustment_mclock: CEPH_ARGS= 2026-03-08T22:41:54.347 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:290: TEST_backfill_limit_adjustment_mclock: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok config get osd_max_backfills 2026-03-08T22:41:54.405 
INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:290: TEST_backfill_limit_adjustment_mclock: max_backfills=20 2026-03-08T22:41:54.405 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:291: TEST_backfill_limit_adjustment_mclock: test 20 = 20 2026-03-08T22:41:54.405 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:295: TEST_backfill_limit_adjustment_mclock: bc 2026-03-08T22:41:54.405 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:295: TEST_backfill_limit_adjustment_mclock: get_asok_path osd.0 2026-03-08T22:41:54.405 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T22:41:54.405 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T22:41:54.405 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:41:54.405 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:54.405 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:41:54.406 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok 2026-03-08T22:41:54.406 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:295: TEST_backfill_limit_adjustment_mclock: 
CEPH_ARGS= 2026-03-08T22:41:54.406 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:295: TEST_backfill_limit_adjustment_mclock: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok dump_recovery_reservations 2026-03-08T22:41:54.406 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:295: TEST_backfill_limit_adjustment_mclock: jq .local_reservations.max_allowed 2026-03-08T22:41:54.465 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:295: TEST_backfill_limit_adjustment_mclock: max_backfills=20 2026-03-08T22:41:54.465 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:296: TEST_backfill_limit_adjustment_mclock: test 20 = 20 2026-03-08T22:41:54.465 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:298: TEST_backfill_limit_adjustment_mclock: bc 2026-03-08T22:41:54.465 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:298: TEST_backfill_limit_adjustment_mclock: get_asok_path osd.0 2026-03-08T22:41:54.465 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T22:41:54.465 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T22:41:54.465 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:41:54.465 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:54.466 
INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:41:54.466 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok 2026-03-08T22:41:54.466 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:298: TEST_backfill_limit_adjustment_mclock: jq .remote_reservations.max_allowed 2026-03-08T22:41:54.466 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:298: TEST_backfill_limit_adjustment_mclock: CEPH_ARGS= 2026-03-08T22:41:54.466 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:298: TEST_backfill_limit_adjustment_mclock: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok dump_recovery_reservations 2026-03-08T22:41:54.525 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:298: TEST_backfill_limit_adjustment_mclock: max_backfills=20 2026-03-08T22:41:54.526 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:299: TEST_backfill_limit_adjustment_mclock: test 20 = 20 2026-03-08T22:41:54.526 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:301: TEST_backfill_limit_adjustment_mclock: teardown td/mclock-config 2026-03-08T22:41:54.526 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/mclock-config 2026-03-08T22:41:54.526 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 
2026-03-08T22:41:54.526 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/mclock-config KILL 2026-03-08T22:41:54.526 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:41:54.526 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:41:54.526 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:41:54.526 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:41:54.526 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:41:54.639 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:41:54.639 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:41:54.640 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:41:54.640 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:41:54.641 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:41:54.641 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:41:54.642 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:41:54.642 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:41:54.642 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:41:54.643 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:41:54.643 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:41:54.644 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:41:54.645 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:41:54.645 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/mclock-config 2026-03-08T22:41:54.650 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:41:54.650 
INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:54.650 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:41:54.650 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.48241 2026-03-08T22:41:54.651 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:41:54.651 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:41:54.651 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:34: run: teardown td/mclock-config 2026-03-08T22:41:54.651 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/mclock-config 2026-03-08T22:41:54.651 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:41:54.651 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/mclock-config KILL 2026-03-08T22:41:54.652 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:41:54.652 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:41:54.652 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local 
trace=true 2026-03-08T22:41:54.652 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:41:54.652 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:41:54.654 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:41:54.654 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:41:54.655 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:41:54.656 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:41:54.657 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:41:54.657 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:41:54.657 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:41:54.658 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:41:54.658 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:41:54.658 
INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:41:54.658 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:41:54.659 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:41:54.660 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:41:54.660 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/mclock-config 2026-03-08T22:41:54.661 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:41:54.661 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:54.661 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:41:54.661 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.48241 2026-03-08T22:41:54.662 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:41:54.662 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:41:54.662 
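The `get_asok_dir` / `teardown` trace above repeats a simple cleanup pattern in `ceph-helpers.sh`: derive a per-process admin-socket directory under `/tmp` and remove it together with the test directory on teardown. A minimal sketch of that pattern (the `CEPH_ASOK_DIR` override is inferred from the empty `'[' -n '' ']'` test in the trace; treat the exact variable name as an assumption):

```shell
# Sketch of the asok-dir helpers traced above.
# Assumption: the "[ -n '' ]" check corresponds to an optional
# CEPH_ASOK_DIR override; otherwise a per-PID /tmp dir is used,
# matching the /tmp/ceph-asok.48241 paths in the log.
get_asok_dir() {
    if [ -n "${CEPH_ASOK_DIR:-}" ]; then
        echo "$CEPH_ASOK_DIR"
    else
        echo "/tmp/ceph-asok.$$"   # one dir per top-level test process
    fi
}

teardown_asok() {
    local dir
    dir=$(get_asok_dir)
    rm -rf "$dir"                  # mirrors 'rm -rf /tmp/ceph-asok.48241'
}
```

Because the directory name is keyed on the top-level PID, every helper that calls `get_asok_dir` within the same test run resolves to the same path, so teardown can remove it unconditionally.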
INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:31: run: for func in $funcs 2026-03-08T22:41:54.662 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:32: run: setup td/mclock-config 2026-03-08T22:41:54.662 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/mclock-config 2026-03-08T22:41:54.662 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/mclock-config 2026-03-08T22:41:54.662 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/mclock-config 2026-03-08T22:41:54.662 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:41:54.662 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/mclock-config KILL 2026-03-08T22:41:54.663 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:41:54.663 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:41:54.663 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:41:54.663 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:41:54.663 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 
2026-03-08T22:41:54.665 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:41:54.665 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:41:54.666 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:41:54.667 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:41:54.667 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:41:54.667 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:41:54.668 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:41:54.668 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:41:54.669 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:41:54.669 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:41:54.669 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:41:54.670 
INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:41:54.671 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:41:54.671 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/mclock-config 2026-03-08T22:41:54.672 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:41:54.672 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:54.672 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:41:54.672 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.48241 2026-03-08T22:41:54.673 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:41:54.673 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:41:54.673 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/mclock-config 2026-03-08T22:41:54.674 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:41:54.674 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' 
']' 2026-03-08T22:41:54.674 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:41:54.674 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.48241 2026-03-08T22:41:54.675 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:41:54.675 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 2026-03-08T22:41:54.675 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:41:54.675 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/mclock-config 1' TERM HUP INT 2026-03-08T22:41:54.675 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:33: run: TEST_profile_builtin_to_custom td/mclock-config 2026-03-08T22:41:54.675 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:39: TEST_profile_builtin_to_custom: local dir=td/mclock-config 2026-03-08T22:41:54.676 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:41: TEST_profile_builtin_to_custom: run_mon td/mclock-config a 2026-03-08T22:41:54.676 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/mclock-config 2026-03-08T22:41:54.676 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:41:54.676 
INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:41:54.676 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:41:54.676 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/mclock-config/a 2026-03-08T22:41:54.676 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/mclock-config/a --run-dir=td/mclock-config 2026-03-08T22:41:54.701 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:41:54.701 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:41:54.701 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:41:54.701 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:41:54.701 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:54.701 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:41:54.702 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.48241/$cluster-$name.asok' 2026-03-08T22:41:54.702 
INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/mclock-config/a '--log-file=td/mclock-config/$name.log' '--admin-socket=/tmp/ceph-asok.48241/$cluster-$name.asok' --mon-cluster-log-file=td/mclock-config/log --run-dir=td/mclock-config '--pid-file=td/mclock-config/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T22:41:54.732 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:41:54.732 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T22:41:54.732 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:41:54.732 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:41:54.732 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:41:54.733 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:41:54.733 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:41:54.733 
INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:41:54.733 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:41:54.734 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:41:54.734 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:54.734 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:41:54.734 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-mon.a.asok 2026-03-08T22:41:54.734 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:41:54.734 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.48241/ceph-mon.a.asok config get fsid 2026-03-08T22:41:54.783 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:41:54.783 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:41:54.783 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:41:54.783 
INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:41:54.783 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:41:54.783 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:41:54.783 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:41:54.783 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:41:54.784 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:41:54.784 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:54.784 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:41:54.784 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-mon.a.asok 2026-03-08T22:41:54.784 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:41:54.784 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.48241/ceph-mon.a.asok config get mon_host 2026-03-08T22:41:54.837 
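The `get_config` calls traced above show how `run_mon` reads `fsid` and `mon_host` back out of the live monitor: clear `CEPH_ARGS` so harness defaults do not leak into the query, ask the daemon for JSON over its admin socket, and extract the field with `jq -r`. A sketch of that pattern, with the `ceph` CLI stubbed so the snippet runs without a live cluster (the stub and its canned output are illustrative only):

```shell
# Pattern from the trace:
#   CEPH_ARGS= ceph --format json daemon <asok> config get <key> | jq -r .<key>
# Stub stand-in for the real CLI, emitting what 'config get fsid'
# returned in this run (value taken from the log).
ceph() {
    echo '{"fsid":"9ecd8d8d-2aa9-4009-8e34-696180900648"}'
}

get_config() {
    local daemon=$1 id=$2 config=$3
    # Empty CEPH_ARGS keeps test-harness defaults out of the query.
    CEPH_ARGS='' ceph --format json daemon \
        "/tmp/ceph-asok.$$/ceph-$daemon.$id.asok" \
        config get "$config" | jq -r ".$config"
}
```

The admin-socket route matters here because the standalone harness runs daemons with non-default paths and ports; querying the daemon directly avoids depending on a `ceph.conf`.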
INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:42: TEST_profile_builtin_to_custom: run_mgr td/mclock-config x 2026-03-08T22:41:54.837 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/mclock-config 2026-03-08T22:41:54.837 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T22:41:54.837 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T22:41:54.837 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T22:41:54.837 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/mclock-config/x 2026-03-08T22:41:54.837 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T22:41:54.950 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T22:41:54.950 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:41:54.950 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:41:54.950 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:41:54.950 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' 
']' 2026-03-08T22:41:54.950 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:41:54.951 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.48241/$cluster-$name.asok' 2026-03-08T22:41:54.951 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:41:54.952 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/mclock-config/x '--log-file=td/mclock-config/$name.log' '--admin-socket=/tmp/ceph-asok.48241/$cluster-$name.asok' --run-dir=td/mclock-config '--pid-file=td/mclock-config/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:41:54.973 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:43: TEST_profile_builtin_to_custom: run_osd td/mclock-config 0 --osd_op_queue=mclock_scheduler 2026-03-08T22:41:54.976 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/mclock-config 2026-03-08T22:41:54.976 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:41:54.976 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:41:54.976 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 
2026-03-08T22:41:54.976 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/mclock-config/0 2026-03-08T22:41:54.976 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=9ecd8d8d-2aa9-4009-8e34-696180900648 --auth-supported=none --mon-host=127.0.0.1:7124 --debug-mclock 20 ' 2026-03-08T22:41:54.976 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:41:54.976 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:41:54.976 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:41:54.976 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/mclock-config/0' 2026-03-08T22:41:54.976 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/mclock-config/0/journal' 2026-03-08T22:41:54.976 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:41:54.976 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:41:54.976 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/mclock-config' 2026-03-08T22:41:54.977 
INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:41:54.977 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:41:54.977 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:41:54.977 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:41:54.977 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:41:54.977 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:41:54.977 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.48241/$cluster-$name.asok' 2026-03-08T22:41:54.977 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.48241/$cluster-$name.asok' 2026-03-08T22:41:54.977 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:41:54.977 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:41:54.977 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:41:54.977 
INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/mclock-config/$name.log' 2026-03-08T22:41:54.977 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/mclock-config/$name.pid' 2026-03-08T22:41:54.978 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:41:54.978 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:41:54.978 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:41:54.978 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:41:54.978 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:41:54.978 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=--osd_op_queue=mclock_scheduler 2026-03-08T22:41:54.978 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/mclock-config/0 2026-03-08T22:41:54.979 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:41:54.979 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: 
run_osd: local uuid=42809959-8f84-4ff0-851a-f6c0d3ef7787 2026-03-08T22:41:54.980 INFO:tasks.workunit.client.0.vm07.stdout:add osd0 42809959-8f84-4ff0-851a-f6c0d3ef7787 2026-03-08T22:41:54.980 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 42809959-8f84-4ff0-851a-f6c0d3ef7787' 2026-03-08T22:41:54.980 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:41:54.992 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAy+61p20JCOxAAPnhEnT524UxNvJhR9Ty46g== 2026-03-08T22:41:54.992 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAy+61p20JCOxAAPnhEnT524UxNvJhR9Ty46g=="}' 2026-03-08T22:41:54.992 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 42809959-8f84-4ff0-851a-f6c0d3ef7787 -i td/mclock-config/0/new.json 2026-03-08T22:41:55.118 INFO:tasks.workunit.client.0.vm07.stdout:0 2026-03-08T22:41:55.132 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/mclock-config/0/new.json 2026-03-08T22:41:55.133 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=9ecd8d8d-2aa9-4009-8e34-696180900648 --auth-supported=none --mon-host=127.0.0.1:7124 --debug-mclock 20 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/mclock-config/0 --osd-journal=td/mclock-config/0/journal --chdir= --run-dir=td/mclock-config '--admin-socket=/tmp/ceph-asok.48241/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 
'--log-file=td/mclock-config/$name.log' '--pid-file=td/mclock-config/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_op_queue=mclock_scheduler --mkfs --key AQAy+61p20JCOxAAPnhEnT524UxNvJhR9Ty46g== --osd-uuid 42809959-8f84-4ff0-851a-f6c0d3ef7787 2026-03-08T22:41:55.153 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:41:55.152+0000 7f4a74425780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:55.154 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:41:55.155+0000 7f4a74425780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:55.156 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:41:55.156+0000 7f4a74425780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:41:55.156 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:41:55.156+0000 7f4a74425780 -1 bdev(0x5585a0df6800 td/mclock-config/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:41:55.156 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:41:55.156+0000 7f4a74425780 -1 bluestore(td/mclock-config/0) _read_fsid unparsable uuid 2026-03-08T22:41:57.261 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/mclock-config/0/keyring 2026-03-08T22:41:57.261 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:41:57.261 INFO:tasks.workunit.client.0.vm07.stdout:adding osd0 key to auth repository 2026-03-08T22:41:57.262 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:41:57.262 
INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/mclock-config/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:41:57.401 INFO:tasks.workunit.client.0.vm07.stdout:start osd.0 2026-03-08T22:41:57.401 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:41:57.402 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:41:57.402 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=9ecd8d8d-2aa9-4009-8e34-696180900648 --auth-supported=none --mon-host=127.0.0.1:7124 --debug-mclock 20 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/mclock-config/0 --osd-journal=td/mclock-config/0/journal --chdir= --run-dir=td/mclock-config '--admin-socket=/tmp/ceph-asok.48241/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/mclock-config/$name.log' '--pid-file=td/mclock-config/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_op_queue=mclock_scheduler 2026-03-08T22:41:57.405 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:41:57.408 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:41:57.426 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:41:57.426+0000 7f5d5d29d780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:41:57.427 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:41:57.428+0000 7f5d5d29d780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:41:57.429 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:41:57.429+0000 7f5d5d29d780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:41:57.628 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0
2026-03-08T22:41:57.628 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:41:57.628 INFO:tasks.workunit.client.0.vm07.stdout:0
2026-03-08T22:41:57.628 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T22:41:57.628 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:41:57.628 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:41:57.628 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:41:57.628 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:41:57.628 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:41:57.628 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:41:57.856 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:41:58.490 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:41:58.490+0000 7f5d5d29d780 -1 Falling back to public interface
2026-03-08T22:41:58.858 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:41:58.858 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:41:58.858 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:41:58.858 INFO:tasks.workunit.client.0.vm07.stdout:1
2026-03-08T22:41:58.859 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:41:58.859 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:41:59.073 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:41:59.395 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:41:59.396+0000 7f5d5d29d780 -1 osd.0 0 log_to_monitors true
2026-03-08T22:42:00.074 INFO:tasks.workunit.client.0.vm07.stdout:2
2026-03-08T22:42:00.074 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:42:00.074 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:42:00.074 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:42:00.075 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:42:00.075 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:42:00.315 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:42:01.316 INFO:tasks.workunit.client.0.vm07.stdout:3
2026-03-08T22:42:01.317 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:42:01.317 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:42:01.317 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:42:01.318 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:42:01.318 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:42:01.552 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:42:01.986 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:01.986+0000 7f5d58a3e640 -1 osd.0 0 waiting for initial osdmap
2026-03-08T22:42:02.554 INFO:tasks.workunit.client.0.vm07.stdout:4
2026-03-08T22:42:02.555 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:42:02.555 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:42:02.555 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4
2026-03-08T22:42:02.555 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:42:02.555 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:42:02.777 INFO:tasks.workunit.client.0.vm07.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/1370478576,v1:127.0.0.1:6803/1370478576] [v2:127.0.0.1:6804/1370478576,v1:127.0.0.1:6805/1370478576] exists,up 42809959-8f84-4ff0-851a-f6c0d3ef7787
2026-03-08T22:42:02.777 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:42:02.777 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:42:02.777 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:42:02.777 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:46: TEST_profile_builtin_to_custom: ceph config get osd.0 osd_mclock_profile
2026-03-08T22:42:02.999 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:46: TEST_profile_builtin_to_custom: local mclock_profile=balanced
2026-03-08T22:42:02.999 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:47: TEST_profile_builtin_to_custom: test balanced = balanced
2026-03-08T22:42:03.000 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:52: TEST_profile_builtin_to_custom: jq .osd_mclock_profile
2026-03-08T22:42:03.000 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:52: TEST_profile_builtin_to_custom: get_asok_path osd.0
2026-03-08T22:42:03.000 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0
2026-03-08T22:42:03.000 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']'
2026-03-08T22:42:03.000 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:03.000 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:03.000 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:03.000 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok
2026-03-08T22:42:03.001 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:52: TEST_profile_builtin_to_custom: CEPH_ARGS=
2026-03-08T22:42:03.001 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:52: TEST_profile_builtin_to_custom: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok config get osd_mclock_profile
2026-03-08T22:42:03.054 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:52: TEST_profile_builtin_to_custom: mclock_profile='"high_recovery_ops"'
2026-03-08T22:42:03.055 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:53: TEST_profile_builtin_to_custom: eval echo '"high_recovery_ops"'
2026-03-08T22:42:03.055 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:53: TEST_profile_builtin_to_custom: echo high_recovery_ops
2026-03-08T22:42:03.055 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:53: TEST_profile_builtin_to_custom: mclock_profile=high_recovery_ops
2026-03-08T22:42:03.055 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:54: TEST_profile_builtin_to_custom: test high_recovery_ops = high_recovery_ops
2026-03-08T22:42:03.055 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:57: TEST_profile_builtin_to_custom: ceph tell osd.0 config set osd_mclock_profile custom
2026-03-08T22:42:03.130 INFO:tasks.workunit.client.0.vm07.stdout:{
2026-03-08T22:42:03.130 INFO:tasks.workunit.client.0.vm07.stdout: "success": ""
2026-03-08T22:42:03.130 INFO:tasks.workunit.client.0.vm07.stdout:}
2026-03-08T22:42:03.140 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:61: TEST_profile_builtin_to_custom: jq .osd_mclock_profile
2026-03-08T22:42:03.141 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:61: TEST_profile_builtin_to_custom: get_asok_path osd.0
2026-03-08T22:42:03.141 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0
2026-03-08T22:42:03.141 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']'
2026-03-08T22:42:03.141 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:03.141 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:03.141 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:03.141 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok
2026-03-08T22:42:03.142 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:61: TEST_profile_builtin_to_custom: CEPH_ARGS=
2026-03-08T22:42:03.142 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:61: TEST_profile_builtin_to_custom: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok config get osd_mclock_profile
2026-03-08T22:42:03.197 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:61: TEST_profile_builtin_to_custom: mclock_profile='"custom"'
2026-03-08T22:42:03.197 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:62: TEST_profile_builtin_to_custom: eval echo '"custom"'
2026-03-08T22:42:03.197 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:62: TEST_profile_builtin_to_custom: echo custom
2026-03-08T22:42:03.197 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:62: TEST_profile_builtin_to_custom: mclock_profile=custom
2026-03-08T22:42:03.197 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:63: TEST_profile_builtin_to_custom: test custom = custom
2026-03-08T22:42:03.198 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:68: TEST_profile_builtin_to_custom: bc
2026-03-08T22:42:03.198 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:68: TEST_profile_builtin_to_custom: get_asok_path osd.0
2026-03-08T22:42:03.198 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0
2026-03-08T22:42:03.199 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']'
2026-03-08T22:42:03.199 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:03.199 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:03.199 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:03.199 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok
2026-03-08T22:42:03.199 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:68: TEST_profile_builtin_to_custom: jq .osd_mclock_scheduler_client_res
2026-03-08T22:42:03.199 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:68: TEST_profile_builtin_to_custom: CEPH_ARGS=
2026-03-08T22:42:03.199 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:68: TEST_profile_builtin_to_custom: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok config get osd_mclock_scheduler_client_res
2026-03-08T22:42:03.256 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:68: TEST_profile_builtin_to_custom: local client_res=0.300000
2026-03-08T22:42:03.256 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:69: TEST_profile_builtin_to_custom: echo 'client_res = 0.300000'
2026-03-08T22:42:03.256 INFO:tasks.workunit.client.0.vm07.stdout:client_res = 0.300000
2026-03-08T22:42:03.257 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:70: TEST_profile_builtin_to_custom: echo '0.300000 + 0.1'
2026-03-08T22:42:03.257 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:70: TEST_profile_builtin_to_custom: bc -l
2026-03-08T22:42:03.257 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:70: TEST_profile_builtin_to_custom: local client_res_new=.400000
2026-03-08T22:42:03.257 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:71: TEST_profile_builtin_to_custom: echo 'client_res_new = .400000'
2026-03-08T22:42:03.258 INFO:tasks.workunit.client.0.vm07.stdout:client_res_new = .400000
2026-03-08T22:42:03.258 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:72: TEST_profile_builtin_to_custom: ceph config set osd.0 osd_mclock_scheduler_client_res .400000
2026-03-08T22:42:03.481 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:77: TEST_profile_builtin_to_custom: ceph config get osd.0 osd_mclock_scheduler_client_res
2026-03-08T22:42:03.709 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:77: TEST_profile_builtin_to_custom: local res=0.400000
2026-03-08T22:42:03.709 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:78: TEST_profile_builtin_to_custom: echo '0.400000 != .400000'
2026-03-08T22:42:03.709 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:78: TEST_profile_builtin_to_custom: bc -l
2026-03-08T22:42:03.710 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:78: TEST_profile_builtin_to_custom: (( 0 ))
2026-03-08T22:42:03.711 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:84: TEST_profile_builtin_to_custom: bc
2026-03-08T22:42:03.711 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:84: TEST_profile_builtin_to_custom: get_asok_path osd.0
2026-03-08T22:42:03.711 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0
2026-03-08T22:42:03.711 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']'
2026-03-08T22:42:03.711 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:03.711 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:03.712 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:03.712 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:84: TEST_profile_builtin_to_custom: jq .osd_mclock_scheduler_client_res
2026-03-08T22:42:03.712 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok
2026-03-08T22:42:03.712 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:84: TEST_profile_builtin_to_custom: CEPH_ARGS=
2026-03-08T22:42:03.712 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:84: TEST_profile_builtin_to_custom: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok config get osd_mclock_scheduler_client_res
2026-03-08T22:42:03.772 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:84: TEST_profile_builtin_to_custom: res=0.400000
2026-03-08T22:42:03.772 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:85: TEST_profile_builtin_to_custom: echo '0.400000 != .400000'
2026-03-08T22:42:03.772 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:85: TEST_profile_builtin_to_custom: bc -l
2026-03-08T22:42:03.773 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:85: TEST_profile_builtin_to_custom: (( 0 ))
2026-03-08T22:42:03.773 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:89: TEST_profile_builtin_to_custom: teardown td/mclock-config
2026-03-08T22:42:03.773 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/mclock-config
2026-03-08T22:42:03.773 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:42:03.774 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/mclock-config KILL
2026-03-08T22:42:03.774 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:42:03.774 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:42:03.774 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:42:03.774 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:42:03.774 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:42:03.900 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:42:03.900 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:42:03.901 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:42:03.901 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:42:03.902 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']'
2026-03-08T22:42:03.902 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:42:03.902 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:42:03.903 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:42:03.903 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:42:03.903 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:42:03.904 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:42:03.904 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:42:03.905 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:42:03.905 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/mclock-config
2026-03-08T22:42:03.910 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:42:03.910 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:03.910 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:03.911 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.48241
2026-03-08T22:42:03.911 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:42:03.912 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:42:03.912 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:34: run: teardown td/mclock-config
2026-03-08T22:42:03.912 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/mclock-config
2026-03-08T22:42:03.912 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:42:03.912 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/mclock-config KILL
2026-03-08T22:42:03.912 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:42:03.912 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:42:03.912 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:42:03.912 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:42:03.912 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:42:03.914 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:42:03.915 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:42:03.915 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:42:03.916 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:42:03.917 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']'
2026-03-08T22:42:03.917 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:42:03.917 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:42:03.918 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:42:03.918 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:42:03.918 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:42:03.918 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:42:03.919 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:42:03.920 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:42:03.920 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/mclock-config
2026-03-08T22:42:03.921 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:42:03.921 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:03.921 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:03.922 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.48241
2026-03-08T22:42:03.922 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:42:03.923 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:42:03.923 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:31: run: for func in $funcs
2026-03-08T22:42:03.923 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:32: run: setup td/mclock-config
2026-03-08T22:42:03.923 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/mclock-config
2026-03-08T22:42:03.923 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/mclock-config
2026-03-08T22:42:03.923 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/mclock-config
2026-03-08T22:42:03.923 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:42:03.923 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/mclock-config KILL
2026-03-08T22:42:03.923 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:42:03.923 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:42:03.923 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:42:03.923 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:42:03.923 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:42:03.926 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:42:03.926 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:42:03.927 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:42:03.927 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:42:03.928 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:42:03.928 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:42:03.929 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:42:03.929 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:42:03.929 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:42:03.930 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:42:03.930 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:42:03.931 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:42:03.932 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:42:03.932 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/mclock-config 2026-03-08T22:42:03.933 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:42:03.933 
INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:03.933 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:42:03.933 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.48241 2026-03-08T22:42:03.934 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:42:03.934 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:42:03.934 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/mclock-config 2026-03-08T22:42:03.935 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:42:03.935 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:03.936 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:42:03.936 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.48241 2026-03-08T22:42:03.937 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:42:03.937 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 
2026-03-08T22:42:03.937 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:42:03.937 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/mclock-config 1' TERM HUP INT 2026-03-08T22:42:03.937 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:33: run: TEST_profile_custom_to_builtin td/mclock-config 2026-03-08T22:42:03.937 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:93: TEST_profile_custom_to_builtin: local dir=td/mclock-config 2026-03-08T22:42:03.938 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:95: TEST_profile_custom_to_builtin: setup td/mclock-config 2026-03-08T22:42:03.938 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/mclock-config 2026-03-08T22:42:03.938 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/mclock-config 2026-03-08T22:42:03.938 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/mclock-config 2026-03-08T22:42:03.938 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:42:03.938 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/mclock-config KILL 2026-03-08T22:42:03.938 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:42:03.938 
INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:42:03.938 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:42:03.938 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:42:03.938 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:42:03.940 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:42:03.941 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:42:03.941 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:42:03.942 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:42:03.943 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:42:03.943 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:42:03.943 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:42:03.944 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:42:03.944 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:42:03.944 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:42:03.944 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:42:03.945 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:42:03.946 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:42:03.946 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/mclock-config 2026-03-08T22:42:03.947 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:42:03.947 
INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:03.947 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:42:03.948 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.48241 2026-03-08T22:42:03.948 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:42:03.948 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:42:03.949 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/mclock-config 2026-03-08T22:42:03.950 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:42:03.950 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:03.950 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:42:03.950 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.48241 2026-03-08T22:42:03.951 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:42:03.951 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 
2026-03-08T22:42:03.951 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:42:03.952 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/mclock-config 1' TERM HUP INT 2026-03-08T22:42:03.952 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:96: TEST_profile_custom_to_builtin: run_mon td/mclock-config a 2026-03-08T22:42:03.952 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/mclock-config 2026-03-08T22:42:03.952 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:42:03.952 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:42:03.952 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:42:03.952 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/mclock-config/a 2026-03-08T22:42:03.952 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/mclock-config/a --run-dir=td/mclock-config 2026-03-08T22:42:03.979 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:42:03.979 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:42:03.979 
INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:42:03.979 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:42:03.979 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:03.979 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:42:03.979 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.48241/$cluster-$name.asok' 2026-03-08T22:42:03.980 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/mclock-config/a '--log-file=td/mclock-config/$name.log' '--admin-socket=/tmp/ceph-asok.48241/$cluster-$name.asok' --mon-cluster-log-file=td/mclock-config/log --run-dir=td/mclock-config '--pid-file=td/mclock-config/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T22:42:04.008 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:42:04.008 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T22:42:04.008 
INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:42:04.008 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:42:04.009 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:42:04.009 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:42:04.009 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:42:04.009 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:42:04.009 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:42:04.011 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:42:04.011 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:04.011 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:42:04.011 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-mon.a.asok 2026-03-08T22:42:04.011 
INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:42:04.011 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.48241/ceph-mon.a.asok config get fsid 2026-03-08T22:42:04.060 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:42:04.060 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:42:04.060 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:42:04.060 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:42:04.060 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:42:04.060 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:42:04.061 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:42:04.061 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:42:04.061 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:42:04.061 
INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:04.061 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:42:04.061 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-mon.a.asok 2026-03-08T22:42:04.061 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:42:04.061 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.48241/ceph-mon.a.asok config get mon_host 2026-03-08T22:42:04.110 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:97: TEST_profile_custom_to_builtin: run_mgr td/mclock-config x 2026-03-08T22:42:04.110 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/mclock-config 2026-03-08T22:42:04.110 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T22:42:04.110 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T22:42:04.110 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T22:42:04.110 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/mclock-config/x 2026-03-08T22:42:04.110 
INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T22:42:04.223 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T22:42:04.224 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:42:04.224 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:42:04.224 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:42:04.224 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:04.224 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:42:04.224 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.48241/$cluster-$name.asok' 2026-03-08T22:42:04.224 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:42:04.225 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/mclock-config/x '--log-file=td/mclock-config/$name.log' '--admin-socket=/tmp/ceph-asok.48241/$cluster-$name.asok' 
--run-dir=td/mclock-config '--pid-file=td/mclock-config/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:42:04.248 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:98: TEST_profile_custom_to_builtin: run_osd td/mclock-config 0 --osd_op_queue=mclock_scheduler 2026-03-08T22:42:04.248 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/mclock-config 2026-03-08T22:42:04.248 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:42:04.249 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:42:04.249 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:42:04.249 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/mclock-config/0 2026-03-08T22:42:04.249 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=9ecd8d8d-2aa9-4009-8e34-696180900648 --auth-supported=none --mon-host=127.0.0.1:7124 --debug-mclock 20 ' 2026-03-08T22:42:04.249 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:42:04.249 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:42:04.249 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' 
--osd-scrub-load-threshold=2000' 2026-03-08T22:42:04.249 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/mclock-config/0' 2026-03-08T22:42:04.249 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/mclock-config/0/journal' 2026-03-08T22:42:04.249 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:42:04.249 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:42:04.249 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/mclock-config' 2026-03-08T22:42:04.250 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:42:04.250 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:42:04.250 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:42:04.250 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:42:04.250 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:04.250 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:42:04.250 
INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.48241/$cluster-$name.asok'
2026-03-08T22:42:04.251 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.48241/$cluster-$name.asok'
2026-03-08T22:42:04.251 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:42:04.251 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:42:04.251 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:42:04.251 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/mclock-config/$name.log'
2026-03-08T22:42:04.251 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/mclock-config/$name.pid'
2026-03-08T22:42:04.251 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:42:04.251 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:42:04.251 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:42:04.251 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:42:04.251 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:42:04.251 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=--osd_op_queue=mclock_scheduler
2026-03-08T22:42:04.251 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/mclock-config/0
2026-03-08T22:42:04.252 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:42:04.253 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=86d2eae0-9fb9-4abf-92f5-04d346fc9f13
2026-03-08T22:42:04.253 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 86d2eae0-9fb9-4abf-92f5-04d346fc9f13'
2026-03-08T22:42:04.253 INFO:tasks.workunit.client.0.vm07.stdout:add osd0 86d2eae0-9fb9-4abf-92f5-04d346fc9f13
2026-03-08T22:42:04.253 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:42:04.265 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQA8+61ppuPqDxAA1XANCmo/Zp3tpmpq6H3NvQ==
2026-03-08T22:42:04.265 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQA8+61ppuPqDxAA1XANCmo/Zp3tpmpq6H3NvQ=="}'
2026-03-08T22:42:04.265 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 86d2eae0-9fb9-4abf-92f5-04d346fc9f13 -i td/mclock-config/0/new.json
2026-03-08T22:42:04.388 INFO:tasks.workunit.client.0.vm07.stdout:0
2026-03-08T22:42:04.401 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/mclock-config/0/new.json
2026-03-08T22:42:04.401 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=9ecd8d8d-2aa9-4009-8e34-696180900648 --auth-supported=none --mon-host=127.0.0.1:7124 --debug-mclock 20 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/mclock-config/0 --osd-journal=td/mclock-config/0/journal --chdir= --run-dir=td/mclock-config '--admin-socket=/tmp/ceph-asok.48241/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/mclock-config/$name.log' '--pid-file=td/mclock-config/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_op_queue=mclock_scheduler --mkfs --key AQA8+61ppuPqDxAA1XANCmo/Zp3tpmpq6H3NvQ== --osd-uuid 86d2eae0-9fb9-4abf-92f5-04d346fc9f13
2026-03-08T22:42:04.420 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:04.420+0000 7f0cf06ba780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:42:04.425 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:04.426+0000 7f0cf06ba780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:42:04.428 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:04.428+0000 7f0cf06ba780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:42:04.428 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:04.428+0000 7f0cf06ba780 -1 bdev(0x558e777dbc00 td/mclock-config/0/block) open stat got: (1) Operation not permitted
2026-03-08T22:42:04.428 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:04.428+0000 7f0cf06ba780 -1 bluestore(td/mclock-config/0) _read_fsid unparsable uuid
2026-03-08T22:42:06.531 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/mclock-config/0/keyring
2026-03-08T22:42:06.531 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:42:06.531 INFO:tasks.workunit.client.0.vm07.stdout:adding osd0 key to auth repository
2026-03-08T22:42:06.532 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository
2026-03-08T22:42:06.532 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/mclock-config/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:42:06.821 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0
2026-03-08T22:42:06.822 INFO:tasks.workunit.client.0.vm07.stdout:start osd.0
2026-03-08T22:42:06.822 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=9ecd8d8d-2aa9-4009-8e34-696180900648 --auth-supported=none --mon-host=127.0.0.1:7124 --debug-mclock 20 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/mclock-config/0 --osd-journal=td/mclock-config/0/journal --chdir= --run-dir=td/mclock-config '--admin-socket=/tmp/ceph-asok.48241/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/mclock-config/$name.log' '--pid-file=td/mclock-config/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_op_queue=mclock_scheduler
2026-03-08T22:42:06.822 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:42:06.823 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:42:06.825 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:42:06.839 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:06.840+0000 7f5f45db4780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:42:06.841 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:06.842+0000 7f5f45db4780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:42:06.843 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:06.843+0000 7f5f45db4780 -1 WARNING: all dangerous and experimental features are enabled.
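The `run_osd` xtrace above (ceph-helpers.sh:648-659) shows per-daemon flags being accumulated into a string before the `ceph-osd` invocation. A minimal sketch of that accumulation pattern, reconstructed from the trace; `build_osd_args` is a hypothetical name, not the helper itself, and the literal `$cluster`/`$name` placeholders are left unexpanded exactly as the trace does:

```shell
#!/bin/bash
# Sketch: accumulate ceph-osd flags the way the run_osd trace shows,
# using bash's += string append. Not the canonical ceph-helpers.sh code.
build_osd_args() {
    local dir=$1
    local ceph_args=""
    # $cluster/$name are expanded later by the daemon, so keep them literal.
    ceph_args+=" --admin-socket=/tmp/ceph-asok.$$/\$cluster-\$name.asok"
    ceph_args+=" --debug-osd=20"
    ceph_args+=" --debug-ms=1"
    ceph_args+=" --debug-monc=20"
    ceph_args+=" --log-file=$dir/\$name.log"
    ceph_args+=" --pid-file=$dir/\$name.pid"
    ceph_args+=" --osd_op_queue=mclock_scheduler"
    echo "$ceph_args"
}
```

The same string is passed twice in the trace: once with `--mkfs` to initialize the data directory, and once to start the daemon.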
2026-03-08T22:42:07.045 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0
2026-03-08T22:42:07.045 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:42:07.045 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T22:42:07.045 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:42:07.045 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:42:07.045 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:42:07.045 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:42:07.045 INFO:tasks.workunit.client.0.vm07.stdout:0
2026-03-08T22:42:07.045 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:42:07.046 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:42:07.433 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:07.433+0000 7f5f45db4780 -1 Falling back to public interface
2026-03-08T22:42:07.565 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:42:08.568 INFO:tasks.workunit.client.0.vm07.stdout:1
2026-03-08T22:42:08.568 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:42:08.568 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:42:08.568 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:42:08.568 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:42:08.568 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:42:08.591 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:08.591+0000 7f5f45db4780 -1 osd.0 0 log_to_monitors true
2026-03-08T22:42:08.801 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:42:09.803 INFO:tasks.workunit.client.0.vm07.stdout:2
2026-03-08T22:42:09.803 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:42:09.803 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:42:09.803 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:42:09.804 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:42:09.804 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:42:10.036 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:42:10.232 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:10.233+0000 7f5f41555640 -1 osd.0 0 waiting for initial osdmap
2026-03-08T22:42:11.038 INFO:tasks.workunit.client.0.vm07.stdout:3
2026-03-08T22:42:11.038 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:42:11.038 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:42:11.038 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:42:11.038 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:42:11.038 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:42:11.256 INFO:tasks.workunit.client.0.vm07.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/1893245919,v1:127.0.0.1:6803/1893245919] [v2:127.0.0.1:6804/1893245919,v1:127.0.0.1:6805/1893245919] exists,up 86d2eae0-9fb9-4abf-92f5-04d346fc9f13
2026-03-08T22:42:11.256 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:42:11.256 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:42:11.256 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:42:11.256
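The `wait_for_osd` xtrace above (ceph-helpers.sh:978-991) shows the polling loop that gates the test: up to 300 one-second attempts, each grepping `ceph osd dump` for the daemon in the desired state. A sketch reconstructed from the trace, under the assumption that only the logic visible in the xtrace matters; it is not the canonical helper:

```shell
#!/bin/bash
# Sketch of the polling loop visible in the wait_for_osd xtrace:
# poll `ceph osd dump` once per second until "osd.<id> <state>" appears,
# giving up after 300 tries. Reconstructed from the trace, not copied
# from ceph-helpers.sh.
wait_for_osd() {
    local state=$1
    local id=$2
    local status=1 i
    for ((i = 0; i < 300; i++)); do
        echo $i                                  # progress marker, as in the trace
        if ceph osd dump | grep "osd.$id $state"; then
            status=0
            break
        fi
        sleep 1
    done
    return $status
}
```

In the run above the loop succeeds on the fourth iteration (`echo 3`), once the OSD's `waiting for initial osdmap` phase has passed and the map shows `osd.0 up`.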
INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:101: TEST_profile_custom_to_builtin: local def_mclock_profile
2026-03-08T22:42:11.256 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:102: TEST_profile_custom_to_builtin: ceph config get osd.0 osd_mclock_profile
2026-03-08T22:42:11.481 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:102: TEST_profile_custom_to_builtin: def_mclock_profile=balanced
2026-03-08T22:42:11.481 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:103: TEST_profile_custom_to_builtin: test balanced = balanced
2026-03-08T22:42:11.482 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:108: TEST_profile_custom_to_builtin: jq .osd_mclock_profile
2026-03-08T22:42:11.482 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:108: TEST_profile_custom_to_builtin: get_asok_path osd.0
2026-03-08T22:42:11.482 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0
2026-03-08T22:42:11.482 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']'
2026-03-08T22:42:11.482 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:11.482 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:11.482 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:11.482 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok
2026-03-08T22:42:11.483 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:108: TEST_profile_custom_to_builtin: CEPH_ARGS=
2026-03-08T22:42:11.483 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:108: TEST_profile_custom_to_builtin: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok config get osd_mclock_profile
2026-03-08T22:42:11.543 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:108: TEST_profile_custom_to_builtin: local 'orig_mclock_profile="high_recovery_ops"'
2026-03-08T22:42:11.543 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:109: TEST_profile_custom_to_builtin: eval echo '"high_recovery_ops"'
2026-03-08T22:42:11.543 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:109: TEST_profile_custom_to_builtin: echo high_recovery_ops
2026-03-08T22:42:11.545 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:109: TEST_profile_custom_to_builtin: orig_mclock_profile=high_recovery_ops
2026-03-08T22:42:11.545 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:110: TEST_profile_custom_to_builtin: test high_recovery_ops = high_recovery_ops 2026-03-08T22:42:11.545
INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:113: TEST_profile_custom_to_builtin: ceph tell osd.0 config set osd_mclock_profile custom
2026-03-08T22:42:11.618 INFO:tasks.workunit.client.0.vm07.stdout:{
2026-03-08T22:42:11.618 INFO:tasks.workunit.client.0.vm07.stdout: "success": ""
2026-03-08T22:42:11.618 INFO:tasks.workunit.client.0.vm07.stdout:}
2026-03-08T22:42:11.627 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:118: TEST_profile_custom_to_builtin: jq .osd_mclock_profile
2026-03-08T22:42:11.627 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:118: TEST_profile_custom_to_builtin: get_asok_path osd.0
2026-03-08T22:42:11.627 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0
2026-03-08T22:42:11.627 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']'
2026-03-08T22:42:11.628 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:11.628 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:11.628 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:11.628 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok
2026-03-08T22:42:11.628 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:118: TEST_profile_custom_to_builtin: CEPH_ARGS=
2026-03-08T22:42:11.628 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:118: TEST_profile_custom_to_builtin: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok config get osd_mclock_profile
2026-03-08T22:42:11.684 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:118: TEST_profile_custom_to_builtin: local 'mclock_profile="custom"'
2026-03-08T22:42:11.684 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:119: TEST_profile_custom_to_builtin: eval echo '"custom"'
2026-03-08T22:42:11.684 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:119: TEST_profile_custom_to_builtin: echo custom
2026-03-08T22:42:11.684 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:119: TEST_profile_custom_to_builtin: mclock_profile=custom
2026-03-08T22:42:11.684 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:120: TEST_profile_custom_to_builtin: test custom = custom
2026-03-08T22:42:11.684 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:123: TEST_profile_custom_to_builtin: local client_res
2026-03-08T22:42:11.685 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:126: TEST_profile_custom_to_builtin: bc
2026-03-08T22:42:11.685 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:126: TEST_profile_custom_to_builtin: get_asok_path osd.0
2026-03-08T22:42:11.685 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0
2026-03-08T22:42:11.685 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']'
2026-03-08T22:42:11.685 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:11.685 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:11.685 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:11.685 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok
2026-03-08T22:42:11.686 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:126: TEST_profile_custom_to_builtin: CEPH_ARGS=
2026-03-08T22:42:11.686 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:126: TEST_profile_custom_to_builtin: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok config get osd_mclock_scheduler_client_res
2026-03-08T22:42:11.686 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:126: TEST_profile_custom_to_builtin: jq .osd_mclock_scheduler_client_res
2026-03-08T22:42:11.742 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:126: TEST_profile_custom_to_builtin: client_res=0.300000 2026-03-08T22:42:11.742
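The repeated `get_asok_path`/`get_asok_dir` sequences in the trace (ceph-helpers.sh:108-118) resolve the admin-socket path the test queries over and over. A sketch reconstructed from the xtrace; the `CEPH_ASOK_DIR` override variable and the `$$` suffix are assumptions inferred from the `'[' -n '' ']'` check and the `/tmp/ceph-asok.48241` value, not verified against the helper source:

```shell
#!/bin/bash
# Sketch of the asok-path resolution visible in the xtrace. Assumptions:
# CEPH_ASOK_DIR is the override variable tested by '[' -n ... ']', and the
# default directory is keyed on the test runner's PID ($$).
get_asok_dir() {
    if [ -n "$CEPH_ASOK_DIR" ]; then
        echo "$CEPH_ASOK_DIR"
    else
        echo "/tmp/ceph-asok.$$"
    fi
}

get_asok_path() {
    local name=$1
    if [ -n "$name" ]; then
        # Named daemon: concrete socket path, e.g. .../ceph-osd.0.asok
        echo "$(get_asok_dir)/ceph-$name.asok"
    else
        # No name: literal template for the daemon to expand itself
        echo "$(get_asok_dir)/\$cluster-\$name.asok"
    fi
}
```

This is why the trace shows both forms: the literal `$cluster-$name.asok` template passed on the `ceph-osd` command line, and the expanded `/tmp/ceph-asok.48241/ceph-osd.0.asok` used for `ceph daemon ... config get` calls.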
INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:127: TEST_profile_custom_to_builtin: echo 'Original client_res for osd.0 = 0.300000'
2026-03-08T22:42:11.742 INFO:tasks.workunit.client.0.vm07.stdout:Original client_res for osd.0 = 0.300000
2026-03-08T22:42:11.742 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:130: TEST_profile_custom_to_builtin: echo '0.300000 + 0.1'
2026-03-08T22:42:11.742 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:130: TEST_profile_custom_to_builtin: bc -l
2026-03-08T22:42:11.743 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:130: TEST_profile_custom_to_builtin: local client_res_new=.400000
2026-03-08T22:42:11.743 INFO:tasks.workunit.client.0.vm07.stdout:client_res_new = .400000
2026-03-08T22:42:11.743 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:131: TEST_profile_custom_to_builtin: echo 'client_res_new = .400000'
2026-03-08T22:42:11.743 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:132: TEST_profile_custom_to_builtin: ceph config set osd osd_mclock_scheduler_client_res .400000
2026-03-08T22:42:11.975 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:136: TEST_profile_custom_to_builtin: ceph config get osd.0 osd_mclock_scheduler_client_res
2026-03-08T22:42:12.205 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:136: TEST_profile_custom_to_builtin: local res=0.400000
2026-03-08T22:42:12.205 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:137: TEST_profile_custom_to_builtin: echo '0.400000 != .400000'
2026-03-08T22:42:12.205 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:137: TEST_profile_custom_to_builtin: bc -l
2026-03-08T22:42:12.206 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:137: TEST_profile_custom_to_builtin: (( 0 ))
2026-03-08T22:42:12.207 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:143: TEST_profile_custom_to_builtin: get_asok_path osd.0
2026-03-08T22:42:12.207 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0
2026-03-08T22:42:12.207 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']'
2026-03-08T22:42:12.207 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:143: TEST_profile_custom_to_builtin: bc
2026-03-08T22:42:12.207 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:12.207 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:12.207 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:12.207 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok
2026-03-08T22:42:12.208 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:143: TEST_profile_custom_to_builtin: CEPH_ARGS=
2026-03-08T22:42:12.208 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:143: TEST_profile_custom_to_builtin: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok config get osd_mclock_scheduler_client_res
2026-03-08T22:42:12.208 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:143: TEST_profile_custom_to_builtin: jq .osd_mclock_scheduler_client_res
2026-03-08T22:42:12.267 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:143: TEST_profile_custom_to_builtin: res=0.400000
2026-03-08T22:42:12.267 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:144: TEST_profile_custom_to_builtin: echo '0.400000 != .400000'
2026-03-08T22:42:12.267 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:144: TEST_profile_custom_to_builtin: bc -l
2026-03-08T22:42:12.269 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:144: TEST_profile_custom_to_builtin: (( 0 ))
2026-03-08T22:42:12.269 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:154: TEST_profile_custom_to_builtin: ceph tell osd.0 config set osd_mclock_profile high_recovery_ops
2026-03-08T22:42:12.381 INFO:tasks.workunit.client.0.vm07.stdout:{
2026-03-08T22:42:12.381 INFO:tasks.workunit.client.0.vm07.stdout: "success": ""
2026-03-08T22:42:12.381 INFO:tasks.workunit.client.0.vm07.stdout:}
2026-03-08T22:42:12.389 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:158: TEST_profile_custom_to_builtin: jq .osd_mclock_profile 2026-03-08T22:42:12.389
INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:158: TEST_profile_custom_to_builtin: get_asok_path osd.0
2026-03-08T22:42:12.389 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0
2026-03-08T22:42:12.389 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']'
2026-03-08T22:42:12.389 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:12.389 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:12.389 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:12.389 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok
2026-03-08T22:42:12.390 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:158: TEST_profile_custom_to_builtin: CEPH_ARGS=
2026-03-08T22:42:12.390 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:158: TEST_profile_custom_to_builtin: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok config get osd_mclock_profile
2026-03-08T22:42:12.448 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:158: TEST_profile_custom_to_builtin: eval 'mclock_profile="high_recovery_ops"'
2026-03-08T22:42:12.448 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:158: TEST_profile_custom_to_builtin: mclock_profile=high_recovery_ops
2026-03-08T22:42:12.448 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:160: TEST_profile_custom_to_builtin: test high_recovery_ops = high_recovery_ops
2026-03-08T22:42:12.448 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:165: TEST_profile_custom_to_builtin: ceph config get osd.0 osd_mclock_scheduler_client_res
2026-03-08T22:42:12.677 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:165: TEST_profile_custom_to_builtin: local res=0.400000
2026-03-08T22:42:12.677 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:166: TEST_profile_custom_to_builtin: echo '0.400000 != .400000'
2026-03-08T22:42:12.677 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:166: TEST_profile_custom_to_builtin: bc -l
2026-03-08T22:42:12.678 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:166: TEST_profile_custom_to_builtin: (( 0 ))
2026-03-08T22:42:12.678 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:172: TEST_profile_custom_to_builtin: bc
2026-03-08T22:42:12.679 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:172: TEST_profile_custom_to_builtin: get_asok_path osd.0
2026-03-08T22:42:12.679 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0
2026-03-08T22:42:12.679 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']'
2026-03-08T22:42:12.679 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:12.679 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:12.679 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:12.679 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok
2026-03-08T22:42:12.679 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:172: TEST_profile_custom_to_builtin: jq .osd_mclock_scheduler_client_res
2026-03-08T22:42:12.679 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:172: TEST_profile_custom_to_builtin: CEPH_ARGS=
2026-03-08T22:42:12.679 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:172: TEST_profile_custom_to_builtin: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok config get osd_mclock_scheduler_client_res
2026-03-08T22:42:12.736 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:172: TEST_profile_custom_to_builtin: res=0.400000
2026-03-08T22:42:12.737 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:173: TEST_profile_custom_to_builtin: echo '0.400000 != .400000'
2026-03-08T22:42:12.737 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:173: TEST_profile_custom_to_builtin: bc -l
2026-03-08T22:42:12.739 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:173: TEST_profile_custom_to_builtin: (( 0 ))
2026-03-08T22:42:12.739 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:178: TEST_profile_custom_to_builtin: ceph config rm osd osd_mclock_scheduler_client_res
2026-03-08T22:42:12.970 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:180: TEST_profile_custom_to_builtin: sleep 5
2026-03-08T22:42:17.974 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:185: TEST_profile_custom_to_builtin: ceph config get osd.0 osd_mclock_scheduler_client_res
2026-03-08T22:42:18.198 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:185: TEST_profile_custom_to_builtin: res=0.000000
2026-03-08T22:42:18.198 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:186: TEST_profile_custom_to_builtin: echo '0.000000 != 0.0'
2026-03-08T22:42:18.198 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:186: TEST_profile_custom_to_builtin: bc -l
2026-03-08T22:42:18.199 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:186: TEST_profile_custom_to_builtin: (( 0 ))
2026-03-08T22:42:18.200 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:193: TEST_profile_custom_to_builtin: bc 2026-03-08T22:42:18.200
INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:193: TEST_profile_custom_to_builtin: get_asok_path osd.0 2026-03-08T22:42:18.200 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T22:42:18.200 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T22:42:18.200 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:42:18.200 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:18.200 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:42:18.201 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok 2026-03-08T22:42:18.201 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:193: TEST_profile_custom_to_builtin: jq .osd_mclock_scheduler_client_res 2026-03-08T22:42:18.201 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:193: TEST_profile_custom_to_builtin: CEPH_ARGS= 2026-03-08T22:42:18.201 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:193: TEST_profile_custom_to_builtin: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok config get osd_mclock_scheduler_client_res 2026-03-08T22:42:18.263 
INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:193: TEST_profile_custom_to_builtin: res=0.300000 2026-03-08T22:42:18.263 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:194: TEST_profile_custom_to_builtin: echo '0.300000 != 0.300000' 2026-03-08T22:42:18.263 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:194: TEST_profile_custom_to_builtin: bc -l 2026-03-08T22:42:18.264 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:194: TEST_profile_custom_to_builtin: (( 0 )) 2026-03-08T22:42:18.264 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:198: TEST_profile_custom_to_builtin: teardown td/mclock-config 2026-03-08T22:42:18.264 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/mclock-config 2026-03-08T22:42:18.264 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:42:18.264 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/mclock-config KILL 2026-03-08T22:42:18.265 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:42:18.265 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:42:18.265 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:42:18.265 
INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:42:18.265 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:42:18.382 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:42:18.382 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:42:18.383 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:42:18.383 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:42:18.384 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:42:18.384 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:42:18.384 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:42:18.385 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:42:18.385 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:42:18.385 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname 
/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:42:18.385 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:42:18.386 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:42:18.387 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:42:18.387 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/mclock-config 2026-03-08T22:42:18.392 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:42:18.393 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:18.393 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:42:18.393 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.48241 2026-03-08T22:42:18.394 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:42:18.394 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:42:18.394 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:34: run: teardown td/mclock-config 2026-03-08T22:42:18.394 
INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/mclock-config 2026-03-08T22:42:18.394 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:42:18.394 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/mclock-config KILL 2026-03-08T22:42:18.394 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:42:18.394 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:42:18.394 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:42:18.394 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:42:18.394 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:42:18.396 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:42:18.396 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:42:18.397 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:42:18.397 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:42:18.398 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:42:18.398 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:42:18.399 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:42:18.399 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:42:18.399 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:42:18.400 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:42:18.400 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:42:18.400 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:42:18.401 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:42:18.401 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/mclock-config 2026-03-08T22:42:18.402 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:42:18.402 
INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:18.402 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:42:18.403 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.48241 2026-03-08T22:42:18.403 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:42:18.403 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:42:18.403 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:31: run: for func in $funcs 2026-03-08T22:42:18.403 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:32: run: setup td/mclock-config 2026-03-08T22:42:18.403 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/mclock-config 2026-03-08T22:42:18.403 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/mclock-config 2026-03-08T22:42:18.403 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/mclock-config 2026-03-08T22:42:18.404 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:42:18.404 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons 
td/mclock-config KILL 2026-03-08T22:42:18.404 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:42:18.404 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:42:18.404 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:42:18.404 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:42:18.404 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:42:18.406 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:42:18.406 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:42:18.407 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:42:18.407 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:42:18.408 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:42:18.408 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:42:18.408 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:42:18.409 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:42:18.409 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:42:18.409 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:42:18.410 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:42:18.410 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:42:18.411 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:42:18.411 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/mclock-config 2026-03-08T22:42:18.412 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:42:18.412 
INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:18.412 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:42:18.413 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.48241 2026-03-08T22:42:18.413 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:42:18.413 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:42:18.413 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/mclock-config 2026-03-08T22:42:18.415 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:42:18.415 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:18.415 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:42:18.415 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.48241 2026-03-08T22:42:18.416 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:42:18.417 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 
2026-03-08T22:42:18.417 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:42:18.417 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/mclock-config 1' TERM HUP INT 2026-03-08T22:42:18.417 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:33: run: TEST_profile_disallow_builtin_params_modify td/mclock-config 2026-03-08T22:42:18.417 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:305: TEST_profile_disallow_builtin_params_modify: local dir=td/mclock-config 2026-03-08T22:42:18.417 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:307: TEST_profile_disallow_builtin_params_modify: setup td/mclock-config 2026-03-08T22:42:18.417 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/mclock-config 2026-03-08T22:42:18.417 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/mclock-config 2026-03-08T22:42:18.417 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/mclock-config 2026-03-08T22:42:18.417 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:42:18.417 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/mclock-config KILL 2026-03-08T22:42:18.417 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: 
shopt -q -o xtrace 2026-03-08T22:42:18.417 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:42:18.417 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:42:18.417 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:42:18.417 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:42:18.419 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:42:18.419 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:42:18.420 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:42:18.421 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:42:18.421 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:42:18.422 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:42:18.422 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:42:18.423 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:42:18.423 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:42:18.423 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:42:18.423 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:42:18.424 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:42:18.425 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:42:18.425 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/mclock-config 2026-03-08T22:42:18.426 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:42:18.426 
INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:18.426 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:42:18.426 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.48241 2026-03-08T22:42:18.427 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:42:18.427 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:42:18.427 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/mclock-config 2026-03-08T22:42:18.429 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:42:18.429 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:18.429 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:42:18.429 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.48241 2026-03-08T22:42:18.430 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:42:18.431 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 
2026-03-08T22:42:18.431 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T22:42:18.431 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/mclock-config 1' TERM HUP INT
2026-03-08T22:42:18.431 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:308: TEST_profile_disallow_builtin_params_modify: run_mon td/mclock-config a
2026-03-08T22:42:18.431 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/mclock-config
2026-03-08T22:42:18.431 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift
2026-03-08T22:42:18.431 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a
2026-03-08T22:42:18.431 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift
2026-03-08T22:42:18.431 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/mclock-config/a
2026-03-08T22:42:18.431 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/mclock-config/a --run-dir=td/mclock-config
2026-03-08T22:42:18.495 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path
2026-03-08T22:42:18.496 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:42:18.496 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:42:18.496 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:42:18.496 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:18.496 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:18.496 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.48241/$cluster-$name.asok'
2026-03-08T22:42:18.496 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/mclock-config/a '--log-file=td/mclock-config/$name.log' '--admin-socket=/tmp/ceph-asok.48241/$cluster-$name.asok' --mon-cluster-log-file=td/mclock-config/log --run-dir=td/mclock-config '--pid-file=td/mclock-config/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false
2026-03-08T22:42:18.531 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat
2026-03-08T22:42:18.531 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid
2026-03-08T22:42:18.531 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:42:18.531 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:42:18.531 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid
2026-03-08T22:42:18.531 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid
2026-03-08T22:42:18.532 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:42:18.532 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:42:18.532 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:42:18.532 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:18.532 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:18.532 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:18.532 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-mon.a.asok
2026-03-08T22:42:18.533 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:42:18.533 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.48241/ceph-mon.a.asok config get fsid
2026-03-08T22:42:18.583 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host
2026-03-08T22:42:18.704 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:42:18.704 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:42:18.704 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host
2026-03-08T22:42:18.704 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host
2026-03-08T22:42:18.704 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:42:18.704 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:42:18.704 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:42:18.704 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:18.704 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:18.705 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:18.705 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-mon.a.asok
2026-03-08T22:42:18.705 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:42:18.705 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.48241/ceph-mon.a.asok config get mon_host
2026-03-08T22:42:18.705 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:309: TEST_profile_disallow_builtin_params_modify: run_mgr td/mclock-config x
2026-03-08T22:42:18.705 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/mclock-config
2026-03-08T22:42:18.705 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift
2026-03-08T22:42:18.705 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x
2026-03-08T22:42:18.705 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift
2026-03-08T22:42:18.705 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/mclock-config/x
2026-03-08T22:42:18.705 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force
2026-03-08T22:42:19.105 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path
2026-03-08T22:42:19.105 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:42:19.105 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:42:19.105 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:42:19.105 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:19.105 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:19.105 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.48241/$cluster-$name.asok'
2026-03-08T22:42:19.106 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:42:19.107 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/mclock-config/x '--log-file=td/mclock-config/$name.log' '--admin-socket=/tmp/ceph-asok.48241/$cluster-$name.asok' --run-dir=td/mclock-config '--pid-file=td/mclock-config/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:42:19.130 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:311: TEST_profile_disallow_builtin_params_modify: run_osd td/mclock-config 0 --osd_op_queue=mclock_scheduler
2026-03-08T22:42:19.130 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/mclock-config
2026-03-08T22:42:19.130 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:42:19.130 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0
2026-03-08T22:42:19.130 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:42:19.130 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/mclock-config/0
2026-03-08T22:42:19.130 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=9ecd8d8d-2aa9-4009-8e34-696180900648 --auth-supported=none --mon-host=127.0.0.1:7124 --debug-mclock 20 '
2026-03-08T22:42:19.130 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:42:19.130 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:42:19.130 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:42:19.130 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/mclock-config/0'
2026-03-08T22:42:19.130 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/mclock-config/0/journal'
2026-03-08T22:42:19.130 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:42:19.130 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:42:19.130 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/mclock-config'
2026-03-08T22:42:19.130 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:42:19.130 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:42:19.130 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:42:19.130 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:42:19.130 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:19.130 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:19.133 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.48241/$cluster-$name.asok'
2026-03-08T22:42:19.133 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.48241/$cluster-$name.asok'
2026-03-08T22:42:19.133 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:42:19.133 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:42:19.133 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:42:19.133 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/mclock-config/$name.log'
2026-03-08T22:42:19.133 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/mclock-config/$name.pid'
2026-03-08T22:42:19.133 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:42:19.133 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:42:19.133 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:42:19.133 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:42:19.133 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:42:19.133 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=--osd_op_queue=mclock_scheduler
2026-03-08T22:42:19.133 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/mclock-config/0
2026-03-08T22:42:19.134 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:42:19.135 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=c991456f-bbfb-4fa3-8f16-e6236acd40fd
2026-03-08T22:42:19.135 INFO:tasks.workunit.client.0.vm07.stdout:add osd0 c991456f-bbfb-4fa3-8f16-e6236acd40fd
2026-03-08T22:42:19.135 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 c991456f-bbfb-4fa3-8f16-e6236acd40fd'
2026-03-08T22:42:19.136 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:42:19.149 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBL+61phgj3CBAAttY4u2NWEML6xZnPLwoUIg==
2026-03-08T22:42:19.149 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBL+61phgj3CBAAttY4u2NWEML6xZnPLwoUIg=="}'
2026-03-08T22:42:19.149 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new c991456f-bbfb-4fa3-8f16-e6236acd40fd -i td/mclock-config/0/new.json
2026-03-08T22:42:19.382 INFO:tasks.workunit.client.0.vm07.stdout:0
2026-03-08T22:42:19.388 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/mclock-config/0/new.json
2026-03-08T22:42:19.389 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=9ecd8d8d-2aa9-4009-8e34-696180900648 --auth-supported=none --mon-host=127.0.0.1:7124 --debug-mclock 20 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/mclock-config/0 --osd-journal=td/mclock-config/0/journal --chdir= --run-dir=td/mclock-config '--admin-socket=/tmp/ceph-asok.48241/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/mclock-config/$name.log' '--pid-file=td/mclock-config/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_op_queue=mclock_scheduler --mkfs --key AQBL+61phgj3CBAAttY4u2NWEML6xZnPLwoUIg== --osd-uuid c991456f-bbfb-4fa3-8f16-e6236acd40fd
2026-03-08T22:42:19.412 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:19.411+0000 7f0d1a81c780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:42:19.413 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:19.413+0000 7f0d1a81c780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:42:19.416 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:19.417+0000 7f0d1a81c780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:42:19.417 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:19.418+0000 7f0d1a81c780 -1 bdev(0x559ba3ddc800 td/mclock-config/0/block) open stat got: (1) Operation not permitted
2026-03-08T22:42:19.417 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:19.418+0000 7f0d1a81c780 -1 bluestore(td/mclock-config/0) _read_fsid unparsable uuid
2026-03-08T22:42:21.909 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/mclock-config/0/keyring
2026-03-08T22:42:21.909 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:42:21.911 INFO:tasks.workunit.client.0.vm07.stdout:adding osd0 key to auth repository
2026-03-08T22:42:21.911 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository
2026-03-08T22:42:21.911 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/mclock-config/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:42:22.206 INFO:tasks.workunit.client.0.vm07.stdout:start osd.0
2026-03-08T22:42:22.207 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0
2026-03-08T22:42:22.207 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=9ecd8d8d-2aa9-4009-8e34-696180900648 --auth-supported=none --mon-host=127.0.0.1:7124 --debug-mclock 20 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/mclock-config/0 --osd-journal=td/mclock-config/0/journal --chdir= --run-dir=td/mclock-config '--admin-socket=/tmp/ceph-asok.48241/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/mclock-config/$name.log' '--pid-file=td/mclock-config/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_op_queue=mclock_scheduler
2026-03-08T22:42:22.207 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:42:22.207 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:42:22.209 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:42:22.225 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:22.225+0000 7fc7bff19780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:42:22.232 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:22.233+0000 7fc7bff19780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:42:22.234 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:22.234+0000 7fc7bff19780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:42:22.482 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0
2026-03-08T22:42:22.482 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:42:22.482 INFO:tasks.workunit.client.0.vm07.stdout:0
2026-03-08T22:42:22.482 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T22:42:22.482 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:42:22.482 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:42:22.483 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:42:22.483 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:42:22.483 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:42:22.483 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:42:22.715 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:42:23.055 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:23.056+0000 7fc7bff19780 -1 Falling back to public interface
2026-03-08T22:42:23.716 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:42:23.717 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:42:23.717 INFO:tasks.workunit.client.0.vm07.stdout:1
2026-03-08T22:42:23.717 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:42:23.718 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:42:23.718 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:42:23.949 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:42:24.173 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:24.174+0000 7fc7bff19780 -1 osd.0 0 log_to_monitors true
2026-03-08T22:42:24.951 INFO:tasks.workunit.client.0.vm07.stdout:2
2026-03-08T22:42:24.951 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:42:24.951 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:42:24.951 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:42:24.951 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:42:24.951 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:42:25.207 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:42:25.476 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:25.476+0000 7fc7bb6ba640 -1 osd.0 0 waiting for initial osdmap
2026-03-08T22:42:26.210 INFO:tasks.workunit.client.0.vm07.stdout:3
2026-03-08T22:42:26.210 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:42:26.210 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:42:26.210 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:42:26.210 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:42:26.210 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:42:26.440 INFO:tasks.workunit.client.0.vm07.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/3985852077,v1:127.0.0.1:6803/3985852077] [v2:127.0.0.1:6804/3985852077,v1:127.0.0.1:6805/3985852077] exists,up c991456f-bbfb-4fa3-8f16-e6236acd40fd
2026-03-08T22:42:26.440 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:42:26.440 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:42:26.440 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:42:26.440 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:314: TEST_profile_disallow_builtin_params_modify: ceph config get osd.0 osd_mclock_profile
2026-03-08T22:42:26.670 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:314: TEST_profile_disallow_builtin_params_modify: local def_mclock_profile=balanced
2026-03-08T22:42:26.670 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:315: TEST_profile_disallow_builtin_params_modify: test balanced = balanced
2026-03-08T22:42:26.670 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:320: TEST_profile_disallow_builtin_params_modify: jq .osd_mclock_profile
2026-03-08T22:42:26.671 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:320: TEST_profile_disallow_builtin_params_modify: get_asok_path osd.0
2026-03-08T22:42:26.671 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0
2026-03-08T22:42:26.671 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']'
2026-03-08T22:42:26.671 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:26.671 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:26.671 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:26.671 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok
2026-03-08T22:42:26.671 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:320: TEST_profile_disallow_builtin_params_modify: CEPH_ARGS=
2026-03-08T22:42:26.671 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:320: TEST_profile_disallow_builtin_params_modify: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok config get osd_mclock_profile
2026-03-08T22:42:26.731 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:320: TEST_profile_disallow_builtin_params_modify: local 'cur_mclock_profile="high_recovery_ops"'
2026-03-08T22:42:26.731 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:321: TEST_profile_disallow_builtin_params_modify: eval echo '"high_recovery_ops"'
2026-03-08T22:42:26.731 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:321: TEST_profile_disallow_builtin_params_modify: echo high_recovery_ops
2026-03-08T22:42:26.731 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:321: TEST_profile_disallow_builtin_params_modify: cur_mclock_profile=high_recovery_ops
2026-03-08T22:42:26.732 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:322: TEST_profile_disallow_builtin_params_modify: test high_recovery_ops = high_recovery_ops
2026-03-08T22:42:26.732 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:324: TEST_profile_disallow_builtin_params_modify: options=('osd_mclock_scheduler_background_recovery_res' 'osd_mclock_scheduler_client_res')
2026-03-08T22:42:26.732 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:324: TEST_profile_disallow_builtin_params_modify: declare -a options
2026-03-08T22:42:26.732 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:327: TEST_profile_disallow_builtin_params_modify: local retries=10
2026-03-08T22:42:26.732 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:328: TEST_profile_disallow_builtin_params_modify: local errors=0
2026-03-08T22:42:26.732 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:329: TEST_profile_disallow_builtin_params_modify: for opt in "${options[@]}"
2026-03-08T22:42:26.732 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:333: TEST_profile_disallow_builtin_params_modify: bc
2026-03-08T22:42:26.732 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:333: TEST_profile_disallow_builtin_params_modify: get_asok_path osd.0
2026-03-08T22:42:26.732 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0
2026-03-08T22:42:26.732 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']'
2026-03-08T22:42:26.732 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:26.733 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:26.733 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:26.733 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok
2026-03-08T22:42:26.733 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:333: TEST_profile_disallow_builtin_params_modify: jq .osd_mclock_scheduler_background_recovery_res
2026-03-08T22:42:26.733 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:333: TEST_profile_disallow_builtin_params_modify: CEPH_ARGS=
2026-03-08T22:42:26.733 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:333: TEST_profile_disallow_builtin_params_modify: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok config get osd_mclock_scheduler_background_recovery_res
2026-03-08T22:42:26.794 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:333: TEST_profile_disallow_builtin_params_modify: local opt_val_orig=0.700000
2026-03-08T22:42:26.795 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:334: TEST_profile_disallow_builtin_params_modify: bc -l
2026-03-08T22:42:26.795 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:334: TEST_profile_disallow_builtin_params_modify: echo '0.700000 + 0.1'
2026-03-08T22:42:26.796 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:334: TEST_profile_disallow_builtin_params_modify: local opt_val_new=.800000
2026-03-08T22:42:26.796 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:335: TEST_profile_disallow_builtin_params_modify: ceph config set osd.0 osd_mclock_scheduler_background_recovery_res .800000
2026-03-08T22:42:27.056 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:338: TEST_profile_disallow_builtin_params_modify: expr 10 - 1
2026-03-08T22:42:27.057 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:338: TEST_profile_disallow_builtin_params_modify: seq 0 9
2026-03-08T22:42:27.058 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:338: TEST_profile_disallow_builtin_params_modify: for count in $(seq 0 $(expr $retries - 1))
2026-03-08T22:42:27.059 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:340: TEST_profile_disallow_builtin_params_modify: errors=0
2026-03-08T22:42:27.059 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:341: TEST_profile_disallow_builtin_params_modify: sleep 2
2026-03-08T22:42:29.060 INFO:tasks.workunit.client.0.vm07.stdout:Check configuration values - Attempt#: 0
2026-03-08T22:42:29.060 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:343: TEST_profile_disallow_builtin_params_modify: echo 'Check configuration values - Attempt#: 0'
2026-03-08T22:42:29.060 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:345: TEST_profile_disallow_builtin_params_modify: ceph config get osd.0 osd_mclock_scheduler_background_recovery_res
2026-03-08T22:42:29.282
INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:345: TEST_profile_disallow_builtin_params_modify: local res=0.000000 2026-03-08T22:42:29.283 INFO:tasks.workunit.client.0.vm07.stdout:Mon db (or default): osd.0 osd_mclock_scheduler_background_recovery_res = 0.000000 2026-03-08T22:42:29.283 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:346: TEST_profile_disallow_builtin_params_modify: echo 'Mon db (or default): osd.0 osd_mclock_scheduler_background_recovery_res = 0.000000' 2026-03-08T22:42:29.283 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:347: TEST_profile_disallow_builtin_params_modify: echo '0.000000 == .800000' 2026-03-08T22:42:29.283 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:347: TEST_profile_disallow_builtin_params_modify: bc -l 2026-03-08T22:42:29.284 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:347: TEST_profile_disallow_builtin_params_modify: (( 0 )) 2026-03-08T22:42:29.284 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:353: TEST_profile_disallow_builtin_params_modify: ceph config show osd.0 2026-03-08T22:42:29.284 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:353: TEST_profile_disallow_builtin_params_modify: awk '{ print $2 }' 2026-03-08T22:42:29.286 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:353: TEST_profile_disallow_builtin_params_modify: grep osd_mclock_scheduler_background_recovery_res 2026-03-08T22:42:29.287 
INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:353: TEST_profile_disallow_builtin_params_modify: bc 2026-03-08T22:42:29.516 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:353: TEST_profile_disallow_builtin_params_modify: res=.700000 2026-03-08T22:42:29.516 INFO:tasks.workunit.client.0.vm07.stdout:Running config: osd.0 osd_mclock_scheduler_background_recovery_res = .700000 2026-03-08T22:42:29.517 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:354: TEST_profile_disallow_builtin_params_modify: echo 'Running config: osd.0 osd_mclock_scheduler_background_recovery_res = .700000' 2026-03-08T22:42:29.517 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:356: TEST_profile_disallow_builtin_params_modify: echo '.700000 == .800000' 2026-03-08T22:42:29.517 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:356: TEST_profile_disallow_builtin_params_modify: bc -l 2026-03-08T22:42:29.518 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:356: TEST_profile_disallow_builtin_params_modify: echo '.700000 != 0.700000' 2026-03-08T22:42:29.518 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:356: TEST_profile_disallow_builtin_params_modify: bc -l 2026-03-08T22:42:29.519 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:356: TEST_profile_disallow_builtin_params_modify: (( 0 || 0 )) 2026-03-08T22:42:29.520 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:362: TEST_profile_disallow_builtin_params_modify: bc 
2026-03-08T22:42:29.520 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:362: TEST_profile_disallow_builtin_params_modify: get_asok_path osd.0 2026-03-08T22:42:29.520 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T22:42:29.520 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T22:42:29.520 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:42:29.520 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:29.520 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:42:29.520 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok 2026-03-08T22:42:29.521 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:362: TEST_profile_disallow_builtin_params_modify: CEPH_ARGS= 2026-03-08T22:42:29.521 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:362: TEST_profile_disallow_builtin_params_modify: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok config get osd_mclock_scheduler_background_recovery_res 2026-03-08T22:42:29.521 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:362: TEST_profile_disallow_builtin_params_modify: jq 
.osd_mclock_scheduler_background_recovery_res 2026-03-08T22:42:29.575 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:362: TEST_profile_disallow_builtin_params_modify: res=0.700000 2026-03-08T22:42:29.579 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:363: TEST_profile_disallow_builtin_params_modify: echo 'Values map: osd.0 osd_mclock_scheduler_background_recovery_res = 0.700000' 2026-03-08T22:42:29.579 INFO:tasks.workunit.client.0.vm07.stdout:Values map: osd.0 osd_mclock_scheduler_background_recovery_res = 0.700000 2026-03-08T22:42:29.580 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:365: TEST_profile_disallow_builtin_params_modify: echo '0.700000 == .800000' 2026-03-08T22:42:29.580 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:365: TEST_profile_disallow_builtin_params_modify: bc -l 2026-03-08T22:42:29.580 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:365: TEST_profile_disallow_builtin_params_modify: echo '0.700000 != 0.700000' 2026-03-08T22:42:29.580 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:365: TEST_profile_disallow_builtin_params_modify: bc -l 2026-03-08T22:42:29.580 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:365: TEST_profile_disallow_builtin_params_modify: (( 0 || 0 )) 2026-03-08T22:42:29.580 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:370: TEST_profile_disallow_builtin_params_modify: '[' 0 -eq 0 ']' 2026-03-08T22:42:29.580 
INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:372: TEST_profile_disallow_builtin_params_modify: break 2026-03-08T22:42:29.580 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:329: TEST_profile_disallow_builtin_params_modify: for opt in "${options[@]}" 2026-03-08T22:42:29.580 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:333: TEST_profile_disallow_builtin_params_modify: get_asok_path osd.0 2026-03-08T22:42:29.580 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:333: TEST_profile_disallow_builtin_params_modify: bc 2026-03-08T22:42:29.580 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T22:42:29.580 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T22:42:29.580 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:42:29.580 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:29.581 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:42:29.581 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok 2026-03-08T22:42:29.581 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:333: 
TEST_profile_disallow_builtin_params_modify: CEPH_ARGS= 2026-03-08T22:42:29.581 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:333: TEST_profile_disallow_builtin_params_modify: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok config get osd_mclock_scheduler_client_res 2026-03-08T22:42:29.581 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:333: TEST_profile_disallow_builtin_params_modify: jq .osd_mclock_scheduler_client_res 2026-03-08T22:42:29.635 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:333: TEST_profile_disallow_builtin_params_modify: local opt_val_orig=0.300000 2026-03-08T22:42:29.636 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:334: TEST_profile_disallow_builtin_params_modify: echo '0.300000 + 0.1' 2026-03-08T22:42:29.636 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:334: TEST_profile_disallow_builtin_params_modify: bc -l 2026-03-08T22:42:29.637 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:334: TEST_profile_disallow_builtin_params_modify: local opt_val_new=.400000 2026-03-08T22:42:29.637 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:335: TEST_profile_disallow_builtin_params_modify: ceph config set osd.0 osd_mclock_scheduler_client_res .400000 2026-03-08T22:42:29.866 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:338: TEST_profile_disallow_builtin_params_modify: expr 10 - 1 2026-03-08T22:42:29.867 
INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:338: TEST_profile_disallow_builtin_params_modify: seq 0 9 2026-03-08T22:42:29.868 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:338: TEST_profile_disallow_builtin_params_modify: for count in $(seq 0 $(expr $retries - 1)) 2026-03-08T22:42:29.868 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:340: TEST_profile_disallow_builtin_params_modify: errors=0 2026-03-08T22:42:29.868 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:341: TEST_profile_disallow_builtin_params_modify: sleep 2 2026-03-08T22:42:31.870 INFO:tasks.workunit.client.0.vm07.stdout:Check configuration values - Attempt#: 0 2026-03-08T22:42:31.870 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:343: TEST_profile_disallow_builtin_params_modify: echo 'Check configuration values - Attempt#: 0' 2026-03-08T22:42:31.870 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:345: TEST_profile_disallow_builtin_params_modify: ceph config get osd.0 osd_mclock_scheduler_client_res 2026-03-08T22:42:32.096 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:345: TEST_profile_disallow_builtin_params_modify: local res=0.000000 2026-03-08T22:42:32.096 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:346: TEST_profile_disallow_builtin_params_modify: echo 'Mon db (or default): osd.0 osd_mclock_scheduler_client_res = 0.000000' 2026-03-08T22:42:32.096 INFO:tasks.workunit.client.0.vm07.stdout:Mon db (or default): osd.0 osd_mclock_scheduler_client_res = 0.000000 
2026-03-08T22:42:32.096 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:347: TEST_profile_disallow_builtin_params_modify: echo '0.000000 == .400000' 2026-03-08T22:42:32.097 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:347: TEST_profile_disallow_builtin_params_modify: bc -l 2026-03-08T22:42:32.098 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:347: TEST_profile_disallow_builtin_params_modify: (( 0 )) 2026-03-08T22:42:32.098 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:353: TEST_profile_disallow_builtin_params_modify: ceph config show osd.0 2026-03-08T22:42:32.098 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:353: TEST_profile_disallow_builtin_params_modify: awk '{ print $2 }' 2026-03-08T22:42:32.099 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:353: TEST_profile_disallow_builtin_params_modify: grep osd_mclock_scheduler_client_res 2026-03-08T22:42:32.101 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:353: TEST_profile_disallow_builtin_params_modify: bc 2026-03-08T22:42:32.335 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:353: TEST_profile_disallow_builtin_params_modify: res=.300000 2026-03-08T22:42:32.335 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:354: TEST_profile_disallow_builtin_params_modify: echo 'Running config: osd.0 osd_mclock_scheduler_client_res = .300000' 2026-03-08T22:42:32.335 INFO:tasks.workunit.client.0.vm07.stdout:Running config: osd.0 
osd_mclock_scheduler_client_res = .300000 2026-03-08T22:42:32.335 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:356: TEST_profile_disallow_builtin_params_modify: echo '.300000 == .400000' 2026-03-08T22:42:32.335 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:356: TEST_profile_disallow_builtin_params_modify: bc -l 2026-03-08T22:42:32.336 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:356: TEST_profile_disallow_builtin_params_modify: echo '.300000 != 0.300000' 2026-03-08T22:42:32.336 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:356: TEST_profile_disallow_builtin_params_modify: bc -l 2026-03-08T22:42:32.337 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:356: TEST_profile_disallow_builtin_params_modify: (( 0 || 0 )) 2026-03-08T22:42:32.338 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:362: TEST_profile_disallow_builtin_params_modify: bc 2026-03-08T22:42:32.338 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:362: TEST_profile_disallow_builtin_params_modify: get_asok_path osd.0 2026-03-08T22:42:32.338 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T22:42:32.338 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T22:42:32.338 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:42:32.338 
INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:32.338 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:42:32.338 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok 2026-03-08T22:42:32.338 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:362: TEST_profile_disallow_builtin_params_modify: jq .osd_mclock_scheduler_client_res 2026-03-08T22:42:32.339 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:362: TEST_profile_disallow_builtin_params_modify: CEPH_ARGS= 2026-03-08T22:42:32.339 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:362: TEST_profile_disallow_builtin_params_modify: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok config get osd_mclock_scheduler_client_res 2026-03-08T22:42:32.393 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:362: TEST_profile_disallow_builtin_params_modify: res=0.300000 2026-03-08T22:42:32.393 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:363: TEST_profile_disallow_builtin_params_modify: echo 'Values map: osd.0 osd_mclock_scheduler_client_res = 0.300000' 2026-03-08T22:42:32.393 INFO:tasks.workunit.client.0.vm07.stdout:Values map: osd.0 osd_mclock_scheduler_client_res = 0.300000 2026-03-08T22:42:32.393 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:365: 
TEST_profile_disallow_builtin_params_modify: echo '0.300000 == .400000' 2026-03-08T22:42:32.393 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:365: TEST_profile_disallow_builtin_params_modify: bc -l 2026-03-08T22:42:32.395 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:365: TEST_profile_disallow_builtin_params_modify: echo '0.300000 != 0.300000' 2026-03-08T22:42:32.395 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:365: TEST_profile_disallow_builtin_params_modify: bc -l 2026-03-08T22:42:32.396 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:365: TEST_profile_disallow_builtin_params_modify: (( 0 || 0 )) 2026-03-08T22:42:32.396 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:370: TEST_profile_disallow_builtin_params_modify: '[' 0 -eq 0 ']' 2026-03-08T22:42:32.396 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:372: TEST_profile_disallow_builtin_params_modify: break 2026-03-08T22:42:32.396 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:380: TEST_profile_disallow_builtin_params_modify: teardown td/mclock-config 2026-03-08T22:42:32.396 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/mclock-config 2026-03-08T22:42:32.396 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:42:32.396 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons 
td/mclock-config KILL 2026-03-08T22:42:32.397 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:42:32.397 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:42:32.397 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:42:32.397 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:42:32.397 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:42:32.512 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:42:32.512 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:42:32.513 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:42:32.513 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:42:32.514 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']' 2026-03-08T22:42:32.514 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:42:32.514 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:42:32.515 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:42:32.515 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:42:32.515 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:42:32.515 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:42:32.516 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:42:32.517 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:42:32.517 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/mclock-config 2026-03-08T22:42:32.524 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:42:32.524 
INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:32.524 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:32.524 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.48241
2026-03-08T22:42:32.524 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:42:32.524 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:42:32.524 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:34: run: teardown td/mclock-config
2026-03-08T22:42:32.524 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/mclock-config
2026-03-08T22:42:32.524 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:42:32.524 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/mclock-config KILL
2026-03-08T22:42:32.524 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:42:32.524 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:42:32.524 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:42:32.524 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:42:32.524 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:42:32.525 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:42:32.526 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:42:32.526 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:42:32.527 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:42:32.527 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']'
2026-03-08T22:42:32.528 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:42:32.528 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:42:32.528 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:42:32.528 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:42:32.529 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:42:32.529 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:42:32.529 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:42:32.530 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:42:32.530 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/mclock-config
2026-03-08T22:42:32.531 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:42:32.531 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:32.531 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:32.532 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.48241
2026-03-08T22:42:32.533 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:42:32.533 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:42:32.533 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:31: run: for func in $funcs
2026-03-08T22:42:32.533 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:32: run: setup td/mclock-config
2026-03-08T22:42:32.533 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/mclock-config
2026-03-08T22:42:32.533 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/mclock-config
2026-03-08T22:42:32.533 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/mclock-config
2026-03-08T22:42:32.533 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:42:32.533 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/mclock-config KILL
2026-03-08T22:42:32.533 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:42:32.533 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:42:32.533 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:42:32.533 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:42:32.533 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:42:32.534 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:42:32.534 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:42:32.535 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:42:32.535 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:42:32.536 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']'
2026-03-08T22:42:32.536 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:42:32.536 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:42:32.537 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:42:32.537 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:42:32.537 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:42:32.538 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:42:32.538 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:42:32.539 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:42:32.539 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/mclock-config
2026-03-08T22:42:32.540 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:42:32.540 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:32.540 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:32.540 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.48241
2026-03-08T22:42:32.541 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:42:32.541 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:42:32.541 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/mclock-config
2026-03-08T22:42:32.542 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T22:42:32.542 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:32.543 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:32.543 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.48241
2026-03-08T22:42:32.544 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n
2026-03-08T22:42:32.544 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']'
2026-03-08T22:42:32.544 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T22:42:32.544 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/mclock-config 1' TERM HUP INT
2026-03-08T22:42:32.544 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:33: run: TEST_profile_disallow_builtin_params_override td/mclock-config
2026-03-08T22:42:32.544 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:384: TEST_profile_disallow_builtin_params_override: local dir=td/mclock-config
2026-03-08T22:42:32.544 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:386: TEST_profile_disallow_builtin_params_override: setup td/mclock-config
2026-03-08T22:42:32.544 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/mclock-config
2026-03-08T22:42:32.544 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/mclock-config
2026-03-08T22:42:32.544 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/mclock-config
2026-03-08T22:42:32.545 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:42:32.545 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/mclock-config KILL
2026-03-08T22:42:32.545 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:42:32.545 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:42:32.545 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:42:32.545 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:42:32.545 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:42:32.547 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:42:32.547 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:42:32.548 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:42:32.548 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:42:32.549 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']'
2026-03-08T22:42:32.549 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:42:32.549 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:42:32.550 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:42:32.550 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:42:32.550 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:42:32.550 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:42:32.551 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:42:32.552 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:42:32.552 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/mclock-config
2026-03-08T22:42:32.553 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:42:32.553 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:32.553 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:32.553 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.48241
2026-03-08T22:42:32.554 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:42:32.554 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:42:32.554 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/mclock-config
2026-03-08T22:42:32.555 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T22:42:32.555 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:32.555 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:32.555 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.48241
2026-03-08T22:42:32.556 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n
2026-03-08T22:42:32.557 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']'
2026-03-08T22:42:32.557 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T22:42:32.557 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/mclock-config 1' TERM HUP INT
2026-03-08T22:42:32.557 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:387: TEST_profile_disallow_builtin_params_override: run_mon td/mclock-config a
2026-03-08T22:42:32.557 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/mclock-config
2026-03-08T22:42:32.557 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift
2026-03-08T22:42:32.557 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a
2026-03-08T22:42:32.557 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift
2026-03-08T22:42:32.557 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/mclock-config/a
2026-03-08T22:42:32.557 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/mclock-config/a --run-dir=td/mclock-config
2026-03-08T22:42:32.583 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path
2026-03-08T22:42:32.583 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:42:32.583 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:42:32.583 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:42:32.583 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:32.583 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:32.583 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.48241/$cluster-$name.asok'
2026-03-08T22:42:32.584 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/mclock-config/a '--log-file=td/mclock-config/$name.log' '--admin-socket=/tmp/ceph-asok.48241/$cluster-$name.asok' --mon-cluster-log-file=td/mclock-config/log --run-dir=td/mclock-config '--pid-file=td/mclock-config/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false
2026-03-08T22:42:32.615 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat
2026-03-08T22:42:32.616 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid
2026-03-08T22:42:32.616 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:42:32.616 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:42:32.616 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid
2026-03-08T22:42:32.616 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid
2026-03-08T22:42:32.616 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:42:32.616 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:42:32.616 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:42:32.618 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:32.618 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:32.618 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:32.618 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-mon.a.asok
2026-03-08T22:42:32.619 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:42:32.619 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.48241/ceph-mon.a.asok config get fsid
2026-03-08T22:42:32.672 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host
2026-03-08T22:42:32.672 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:42:32.672 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:42:32.672 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host
2026-03-08T22:42:32.672 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host
2026-03-08T22:42:32.672 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:42:32.673 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:42:32.673 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:42:32.673 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:32.673 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:32.673 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:32.673 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-mon.a.asok
2026-03-08T22:42:32.673 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:42:32.673 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.48241/ceph-mon.a.asok config get mon_host
2026-03-08T22:42:32.726 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:388: TEST_profile_disallow_builtin_params_override: run_mgr td/mclock-config x
2026-03-08T22:42:32.726 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/mclock-config
2026-03-08T22:42:32.726 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift
2026-03-08T22:42:32.726 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x
2026-03-08T22:42:32.726 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift
2026-03-08T22:42:32.726 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/mclock-config/x
2026-03-08T22:42:32.726 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force
2026-03-08T22:42:32.843 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path
2026-03-08T22:42:32.843 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:42:32.843 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:42:32.843 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:42:32.843 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:32.843 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:32.843 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.48241/$cluster-$name.asok'
2026-03-08T22:42:32.844 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:42:32.844 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/mclock-config/x '--log-file=td/mclock-config/$name.log' '--admin-socket=/tmp/ceph-asok.48241/$cluster-$name.asok' --run-dir=td/mclock-config '--pid-file=td/mclock-config/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:42:32.866 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:390: TEST_profile_disallow_builtin_params_override: run_osd td/mclock-config 0 --osd_op_queue=mclock_scheduler
2026-03-08T22:42:32.866 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/mclock-config
2026-03-08T22:42:32.866 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:42:32.866 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0
2026-03-08T22:42:32.866 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:42:32.866 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/mclock-config/0
2026-03-08T22:42:32.866 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=9ecd8d8d-2aa9-4009-8e34-696180900648 --auth-supported=none --mon-host=127.0.0.1:7124 --debug-mclock 20 '
2026-03-08T22:42:32.866 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:42:32.866 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:42:32.866 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:42:32.866 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/mclock-config/0'
2026-03-08T22:42:32.866 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/mclock-config/0/journal'
2026-03-08T22:42:32.866 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:42:32.866 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:42:32.866 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/mclock-config'
2026-03-08T22:42:32.867 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:42:32.867 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:42:32.867 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:42:32.867 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:42:32.867 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:32.867 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:32.867 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.48241/$cluster-$name.asok'
2026-03-08T22:42:32.867 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.48241/$cluster-$name.asok'
2026-03-08T22:42:32.867 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:42:32.867 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:42:32.867 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:42:32.867 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/mclock-config/$name.log'
2026-03-08T22:42:32.867 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/mclock-config/$name.pid'
2026-03-08T22:42:32.867 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:42:32.867 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:42:32.868 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:42:32.868 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:42:32.868 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:42:32.868 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=--osd_op_queue=mclock_scheduler
2026-03-08T22:42:32.868 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/mclock-config/0
2026-03-08T22:42:32.868 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:42:32.869 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=0a7ea34b-5806-489e-99c3-b5126b8cbe37
2026-03-08T22:42:32.869 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 0a7ea34b-5806-489e-99c3-b5126b8cbe37'
2026-03-08T22:42:32.869 INFO:tasks.workunit.client.0.vm07.stdout:add osd0 0a7ea34b-5806-489e-99c3-b5126b8cbe37
2026-03-08T22:42:32.869 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:42:32.884 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBY+61pzz2uNBAA7UwjTEIFua/v51TZXuMr1Q==
2026-03-08T22:42:32.884 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBY+61pzz2uNBAA7UwjTEIFua/v51TZXuMr1Q=="}'
2026-03-08T22:42:32.884 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 0a7ea34b-5806-489e-99c3-b5126b8cbe37 -i td/mclock-config/0/new.json
2026-03-08T22:42:33.015 INFO:tasks.workunit.client.0.vm07.stdout:0
2026-03-08T22:42:33.025 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/mclock-config/0/new.json
2026-03-08T22:42:33.026 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=9ecd8d8d-2aa9-4009-8e34-696180900648 --auth-supported=none --mon-host=127.0.0.1:7124 --debug-mclock 20 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/mclock-config/0 --osd-journal=td/mclock-config/0/journal --chdir= --run-dir=td/mclock-config '--admin-socket=/tmp/ceph-asok.48241/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/mclock-config/$name.log' '--pid-file=td/mclock-config/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_op_queue=mclock_scheduler --mkfs --key AQBY+61pzz2uNBAA7UwjTEIFua/v51TZXuMr1Q== --osd-uuid 0a7ea34b-5806-489e-99c3-b5126b8cbe37
2026-03-08T22:42:33.048 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:33.048+0000 7f5ef7d2e780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:42:33.049 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:33.050+0000 7f5ef7d2e780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:42:33.051 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:33.052+0000 7f5ef7d2e780 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:42:33.052 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:33.052+0000 7f5ef7d2e780 -1 bdev(0x55852052fc00 td/mclock-config/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:42:33.052 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:33.052+0000 7f5ef7d2e780 -1 bluestore(td/mclock-config/0) _read_fsid unparsable uuid 2026-03-08T22:42:35.643 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/mclock-config/0/keyring 2026-03-08T22:42:35.643 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:42:35.644 INFO:tasks.workunit.client.0.vm07.stdout:adding osd0 key to auth repository 2026-03-08T22:42:35.644 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:42:35.644 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/mclock-config/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:42:35.949 INFO:tasks.workunit.client.0.vm07.stdout:start osd.0 2026-03-08T22:42:35.949 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:42:35.949 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=9ecd8d8d-2aa9-4009-8e34-696180900648 --auth-supported=none --mon-host=127.0.0.1:7124 --debug-mclock 20 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/mclock-config/0 --osd-journal=td/mclock-config/0/journal --chdir= --run-dir=td/mclock-config 
'--admin-socket=/tmp/ceph-asok.48241/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/mclock-config/$name.log' '--pid-file=td/mclock-config/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_op_queue=mclock_scheduler 2026-03-08T22:42:35.949 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:42:35.950 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:42:35.952 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:42:35.968 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:35.968+0000 7f5c0f80c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:42:35.977 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:35.978+0000 7f5c0f80c780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:42:35.979 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:35.979+0000 7f5c0f80c780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:42:36.182 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:42:36.182 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:42:36.182 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:42:36.182 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:42:36.182 INFO:tasks.workunit.client.0.vm07.stdout:0 2026-03-08T22:42:36.182 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:42:36.182 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:42:36.182 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:42:36.182 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:42:36.182 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:42:36.412 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:42:37.413 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:42:37.413 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: 
wait_for_osd: (( i < 300 )) 2026-03-08T22:42:37.413 INFO:tasks.workunit.client.0.vm07.stdout:1 2026-03-08T22:42:37.413 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:42:37.413 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:42:37.413 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:42:37.549 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:37.550+0000 7f5c0f80c780 -1 Falling back to public interface 2026-03-08T22:42:37.642 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:42:38.434 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:38.435+0000 7f5c0f80c780 -1 osd.0 0 log_to_monitors true 2026-03-08T22:42:38.643 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:42:38.643 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:42:38.643 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:42:38.644 INFO:tasks.workunit.client.0.vm07.stdout:2 2026-03-08T22:42:38.644 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:42:38.644 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:42:38.882 
INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:42:39.885 INFO:tasks.workunit.client.0.vm07.stdout:3 2026-03-08T22:42:39.885 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:42:39.885 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:42:39.885 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:42:39.885 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:42:39.885 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:42:40.112 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:42:41.113 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:42:41.113 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:42:41.113 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T22:42:41.113 INFO:tasks.workunit.client.0.vm07.stdout:4 2026-03-08T22:42:41.115 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:42:41.115 
INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:42:41.344 INFO:tasks.workunit.client.0.vm07.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/1327502435,v1:127.0.0.1:6803/1327502435] [v2:127.0.0.1:6804/1327502435,v1:127.0.0.1:6805/1327502435] exists,up 0a7ea34b-5806-489e-99c3-b5126b8cbe37 2026-03-08T22:42:41.344 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:42:41.344 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:42:41.344 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:42:41.345 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:393: TEST_profile_disallow_builtin_params_override: ceph config get osd.0 osd_mclock_profile 2026-03-08T22:42:41.569 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:393: TEST_profile_disallow_builtin_params_override: local def_mclock_profile=balanced 2026-03-08T22:42:41.569 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:394: TEST_profile_disallow_builtin_params_override: test balanced = balanced 2026-03-08T22:42:41.570 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:399: TEST_profile_disallow_builtin_params_override: jq .osd_mclock_profile 2026-03-08T22:42:41.570 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:399: 
TEST_profile_disallow_builtin_params_override: get_asok_path osd.0 2026-03-08T22:42:41.570 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T22:42:41.570 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T22:42:41.570 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:42:41.570 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:41.570 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:42:41.570 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok 2026-03-08T22:42:41.571 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:399: TEST_profile_disallow_builtin_params_override: CEPH_ARGS= 2026-03-08T22:42:41.571 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:399: TEST_profile_disallow_builtin_params_override: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok config get osd_mclock_profile 2026-03-08T22:42:41.631 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:399: TEST_profile_disallow_builtin_params_override: local 'cur_mclock_profile="high_recovery_ops"' 2026-03-08T22:42:41.631 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:400: 
TEST_profile_disallow_builtin_params_override: eval echo '"high_recovery_ops"' 2026-03-08T22:42:41.631 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:400: TEST_profile_disallow_builtin_params_override: echo high_recovery_ops 2026-03-08T22:42:41.632 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:400: TEST_profile_disallow_builtin_params_override: cur_mclock_profile=high_recovery_ops 2026-03-08T22:42:41.632 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:401: TEST_profile_disallow_builtin_params_override: test high_recovery_ops = high_recovery_ops 2026-03-08T22:42:41.632 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:403: TEST_profile_disallow_builtin_params_override: options=('osd_mclock_scheduler_background_recovery_res' 'osd_mclock_scheduler_client_res') 2026-03-08T22:42:41.632 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:403: TEST_profile_disallow_builtin_params_override: declare -a options 2026-03-08T22:42:41.632 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:406: TEST_profile_disallow_builtin_params_override: local retries=10 2026-03-08T22:42:41.632 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:407: TEST_profile_disallow_builtin_params_override: local errors=0 2026-03-08T22:42:41.632 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:408: TEST_profile_disallow_builtin_params_override: for opt in "${options[@]}" 2026-03-08T22:42:41.633 
INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:412: TEST_profile_disallow_builtin_params_override: bc 2026-03-08T22:42:41.633 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:412: TEST_profile_disallow_builtin_params_override: get_asok_path osd.0 2026-03-08T22:42:41.633 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T22:42:41.633 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T22:42:41.633 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:42:41.633 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:41.633 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:42:41.633 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok 2026-03-08T22:42:41.634 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:412: TEST_profile_disallow_builtin_params_override: jq .osd_mclock_scheduler_background_recovery_res 2026-03-08T22:42:41.634 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:412: TEST_profile_disallow_builtin_params_override: CEPH_ARGS= 2026-03-08T22:42:41.634 
INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:412: TEST_profile_disallow_builtin_params_override: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok config get osd_mclock_scheduler_background_recovery_res 2026-03-08T22:42:41.689 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:412: TEST_profile_disallow_builtin_params_override: local opt_val_orig=0.700000 2026-03-08T22:42:41.690 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:413: TEST_profile_disallow_builtin_params_override: echo '0.700000 + 0.1' 2026-03-08T22:42:41.690 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:413: TEST_profile_disallow_builtin_params_override: bc -l 2026-03-08T22:42:41.691 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:413: TEST_profile_disallow_builtin_params_override: local opt_val_new=.800000 2026-03-08T22:42:41.691 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:414: TEST_profile_disallow_builtin_params_override: ceph tell osd.0 config set osd_mclock_scheduler_background_recovery_res .800000 2026-03-08T22:42:41.758 INFO:tasks.workunit.client.0.vm07.stdout:{ 2026-03-08T22:42:41.758 INFO:tasks.workunit.client.0.vm07.stdout: "success": "osd_mclock_scheduler_background_recovery_res = '' " 2026-03-08T22:42:41.758 INFO:tasks.workunit.client.0.vm07.stdout:} 2026-03-08T22:42:41.766 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:417: TEST_profile_disallow_builtin_params_override: expr 10 - 1 2026-03-08T22:42:41.767 
INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:417: TEST_profile_disallow_builtin_params_override: seq 0 9 2026-03-08T22:42:41.768 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:417: TEST_profile_disallow_builtin_params_override: for count in $(seq 0 $(expr $retries - 1)) 2026-03-08T22:42:41.768 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:419: TEST_profile_disallow_builtin_params_override: errors=0 2026-03-08T22:42:41.768 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:420: TEST_profile_disallow_builtin_params_override: sleep 2 2026-03-08T22:42:43.770 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:422: TEST_profile_disallow_builtin_params_override: echo 'Check configuration values - Attempt#: 0' 2026-03-08T22:42:43.770 INFO:tasks.workunit.client.0.vm07.stdout:Check configuration values - Attempt#: 0 2026-03-08T22:42:43.770 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:424: TEST_profile_disallow_builtin_params_override: ceph config get osd.0 osd_mclock_scheduler_background_recovery_res 2026-03-08T22:42:44.007 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:424: TEST_profile_disallow_builtin_params_override: local res=0.000000 2026-03-08T22:42:44.007 INFO:tasks.workunit.client.0.vm07.stdout:Mon db (or default): osd.0 osd_mclock_scheduler_background_recovery_res = 0.000000 2026-03-08T22:42:44.008 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:425: TEST_profile_disallow_builtin_params_override: echo 'Mon db (or default): osd.0 
osd_mclock_scheduler_background_recovery_res = 0.000000' 2026-03-08T22:42:44.008 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:426: TEST_profile_disallow_builtin_params_override: echo '0.000000 == .800000' 2026-03-08T22:42:44.008 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:426: TEST_profile_disallow_builtin_params_override: bc -l 2026-03-08T22:42:44.009 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:426: TEST_profile_disallow_builtin_params_override: (( 0 )) 2026-03-08T22:42:44.009 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:432: TEST_profile_disallow_builtin_params_override: ceph config show osd.0 2026-03-08T22:42:44.009 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:432: TEST_profile_disallow_builtin_params_override: awk '{ print $2 }' 2026-03-08T22:42:44.011 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:432: TEST_profile_disallow_builtin_params_override: grep osd_mclock_scheduler_background_recovery_res 2026-03-08T22:42:44.012 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:432: TEST_profile_disallow_builtin_params_override: bc 2026-03-08T22:42:44.236 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:432: TEST_profile_disallow_builtin_params_override: res=.700000 2026-03-08T22:42:44.236 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:433: TEST_profile_disallow_builtin_params_override: echo 'Running config: osd.0 osd_mclock_scheduler_background_recovery_res = 
.700000' 2026-03-08T22:42:44.236 INFO:tasks.workunit.client.0.vm07.stdout:Running config: osd.0 osd_mclock_scheduler_background_recovery_res = .700000 2026-03-08T22:42:44.237 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:435: TEST_profile_disallow_builtin_params_override: echo '.700000 == .800000' 2026-03-08T22:42:44.237 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:435: TEST_profile_disallow_builtin_params_override: bc -l 2026-03-08T22:42:44.238 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:435: TEST_profile_disallow_builtin_params_override: echo '.700000 != 0.700000' 2026-03-08T22:42:44.238 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:435: TEST_profile_disallow_builtin_params_override: bc -l 2026-03-08T22:42:44.239 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:435: TEST_profile_disallow_builtin_params_override: (( 0 || 0 )) 2026-03-08T22:42:44.240 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:441: TEST_profile_disallow_builtin_params_override: bc 2026-03-08T22:42:44.240 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:441: TEST_profile_disallow_builtin_params_override: get_asok_path osd.0 2026-03-08T22:42:44.240 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T22:42:44.240 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T22:42:44.241 
INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:44.241 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:441: TEST_profile_disallow_builtin_params_override: jq .osd_mclock_scheduler_background_recovery_res
2026-03-08T22:42:44.241 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:44.241 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:44.241 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok
2026-03-08T22:42:44.241 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:441: TEST_profile_disallow_builtin_params_override: CEPH_ARGS=
2026-03-08T22:42:44.241 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:441: TEST_profile_disallow_builtin_params_override: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok config get osd_mclock_scheduler_background_recovery_res
2026-03-08T22:42:44.295 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:441: TEST_profile_disallow_builtin_params_override: res=0.700000
2026-03-08T22:42:44.295 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:442: TEST_profile_disallow_builtin_params_override: echo 'Values map: osd.0 osd_mclock_scheduler_background_recovery_res = 0.700000'
2026-03-08T22:42:44.295 INFO:tasks.workunit.client.0.vm07.stdout:Values map: osd.0 osd_mclock_scheduler_background_recovery_res = 0.700000
2026-03-08T22:42:44.295 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:444: TEST_profile_disallow_builtin_params_override: echo '0.700000 == .800000'
2026-03-08T22:42:44.295 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:444: TEST_profile_disallow_builtin_params_override: bc -l
2026-03-08T22:42:44.296 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:444: TEST_profile_disallow_builtin_params_override: echo '0.700000 != 0.700000'
2026-03-08T22:42:44.297 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:444: TEST_profile_disallow_builtin_params_override: bc -l
2026-03-08T22:42:44.297 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:444: TEST_profile_disallow_builtin_params_override: (( 0 || 0 ))
2026-03-08T22:42:44.297 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:449: TEST_profile_disallow_builtin_params_override: '[' 0 -eq 0 ']'
2026-03-08T22:42:44.297 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:451: TEST_profile_disallow_builtin_params_override: break
2026-03-08T22:42:44.297 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:408: TEST_profile_disallow_builtin_params_override: for opt in "${options[@]}"
2026-03-08T22:42:44.298 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:412: TEST_profile_disallow_builtin_params_override: get_asok_path osd.0
2026-03-08T22:42:44.298 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:412: TEST_profile_disallow_builtin_params_override: bc
2026-03-08T22:42:44.298 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0
2026-03-08T22:42:44.298 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']'
2026-03-08T22:42:44.298 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:44.298 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:44.298 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:44.299 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok
2026-03-08T22:42:44.299 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:412: TEST_profile_disallow_builtin_params_override: jq .osd_mclock_scheduler_client_res
2026-03-08T22:42:44.299 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:412: TEST_profile_disallow_builtin_params_override: CEPH_ARGS=
2026-03-08T22:42:44.299 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:412: TEST_profile_disallow_builtin_params_override: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok config get osd_mclock_scheduler_client_res
2026-03-08T22:42:44.352 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:412: TEST_profile_disallow_builtin_params_override: local opt_val_orig=0.300000
2026-03-08T22:42:44.352 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:413: TEST_profile_disallow_builtin_params_override: echo '0.300000 + 0.1'
2026-03-08T22:42:44.353 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:413: TEST_profile_disallow_builtin_params_override: bc -l
2026-03-08T22:42:44.354 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:413: TEST_profile_disallow_builtin_params_override: local opt_val_new=.400000
2026-03-08T22:42:44.354 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:414: TEST_profile_disallow_builtin_params_override: ceph tell osd.0 config set osd_mclock_scheduler_client_res .400000
2026-03-08T22:42:44.425 INFO:tasks.workunit.client.0.vm07.stdout:{
2026-03-08T22:42:44.425 INFO:tasks.workunit.client.0.vm07.stdout: "success": "osd_mclock_scheduler_client_res = '' "
2026-03-08T22:42:44.425 INFO:tasks.workunit.client.0.vm07.stdout:}
2026-03-08T22:42:44.432 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:417: TEST_profile_disallow_builtin_params_override: expr 10 - 1
2026-03-08T22:42:44.433 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:417: TEST_profile_disallow_builtin_params_override: seq 0 9
2026-03-08T22:42:44.433 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:417: TEST_profile_disallow_builtin_params_override: for count in $(seq 0 $(expr $retries - 1))
2026-03-08T22:42:44.433 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:419: TEST_profile_disallow_builtin_params_override: errors=0
2026-03-08T22:42:44.433 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:420: TEST_profile_disallow_builtin_params_override: sleep 2
2026-03-08T22:42:46.435 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:422: TEST_profile_disallow_builtin_params_override: echo 'Check configuration values - Attempt#: 0'
2026-03-08T22:42:46.435 INFO:tasks.workunit.client.0.vm07.stdout:Check configuration values - Attempt#: 0
2026-03-08T22:42:46.435 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:424: TEST_profile_disallow_builtin_params_override: ceph config get osd.0 osd_mclock_scheduler_client_res
2026-03-08T22:42:46.662 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:424: TEST_profile_disallow_builtin_params_override: local res=0.000000
2026-03-08T22:42:46.662 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:425: TEST_profile_disallow_builtin_params_override: echo 'Mon db (or default): osd.0 osd_mclock_scheduler_client_res = 0.000000'
2026-03-08T22:42:46.662 INFO:tasks.workunit.client.0.vm07.stdout:Mon db (or default): osd.0 osd_mclock_scheduler_client_res = 0.000000
2026-03-08T22:42:46.662 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:426: TEST_profile_disallow_builtin_params_override: echo '0.000000 == .400000'
2026-03-08T22:42:46.662 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:426: TEST_profile_disallow_builtin_params_override: bc -l
2026-03-08T22:42:46.663 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:426: TEST_profile_disallow_builtin_params_override: (( 0 ))
2026-03-08T22:42:46.664 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:432: TEST_profile_disallow_builtin_params_override: ceph config show osd.0
2026-03-08T22:42:46.664 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:432: TEST_profile_disallow_builtin_params_override: awk '{ print $2 }'
2026-03-08T22:42:46.665 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:432: TEST_profile_disallow_builtin_params_override: grep osd_mclock_scheduler_client_res
2026-03-08T22:42:46.667 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:432: TEST_profile_disallow_builtin_params_override: bc
2026-03-08T22:42:46.905 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:432: TEST_profile_disallow_builtin_params_override: res=.300000
2026-03-08T22:42:46.906 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:433: TEST_profile_disallow_builtin_params_override: echo 'Running config: osd.0 osd_mclock_scheduler_client_res = .300000'
2026-03-08T22:42:46.906 INFO:tasks.workunit.client.0.vm07.stdout:Running config: osd.0 osd_mclock_scheduler_client_res = .300000
2026-03-08T22:42:46.906 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:435: TEST_profile_disallow_builtin_params_override: echo '.300000 == .400000'
2026-03-08T22:42:46.906 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:435: TEST_profile_disallow_builtin_params_override: bc -l
2026-03-08T22:42:46.907 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:435: TEST_profile_disallow_builtin_params_override: echo '.300000 != 0.300000'
2026-03-08T22:42:46.907 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:435: TEST_profile_disallow_builtin_params_override: bc -l
2026-03-08T22:42:46.908 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:435: TEST_profile_disallow_builtin_params_override: (( 0 || 0 ))
2026-03-08T22:42:46.909 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:441: TEST_profile_disallow_builtin_params_override: get_asok_path osd.0
2026-03-08T22:42:46.909 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:441: TEST_profile_disallow_builtin_params_override: bc
2026-03-08T22:42:46.909 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0
2026-03-08T22:42:46.909 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']'
2026-03-08T22:42:46.910 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:46.910 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:46.910 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:46.910 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok
2026-03-08T22:42:46.910 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:441: TEST_profile_disallow_builtin_params_override: jq .osd_mclock_scheduler_client_res
2026-03-08T22:42:46.910 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:441: TEST_profile_disallow_builtin_params_override: CEPH_ARGS=
2026-03-08T22:42:46.910 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:441: TEST_profile_disallow_builtin_params_override: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok config get osd_mclock_scheduler_client_res
2026-03-08T22:42:46.976 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:441: TEST_profile_disallow_builtin_params_override: res=0.300000
2026-03-08T22:42:46.976 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:442: TEST_profile_disallow_builtin_params_override: echo 'Values map: osd.0 osd_mclock_scheduler_client_res = 0.300000'
2026-03-08T22:42:46.976 INFO:tasks.workunit.client.0.vm07.stdout:Values map: osd.0 osd_mclock_scheduler_client_res = 0.300000
2026-03-08T22:42:46.976 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:444: TEST_profile_disallow_builtin_params_override: echo '0.300000 == .400000'
2026-03-08T22:42:46.976 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:444: TEST_profile_disallow_builtin_params_override: bc -l
2026-03-08T22:42:46.978 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:444: TEST_profile_disallow_builtin_params_override: echo '0.300000 != 0.300000'
2026-03-08T22:42:46.979 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:444: TEST_profile_disallow_builtin_params_override: bc -l
2026-03-08T22:42:46.979 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:444: TEST_profile_disallow_builtin_params_override: (( 0 || 0 ))
2026-03-08T22:42:46.979 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:449: TEST_profile_disallow_builtin_params_override: '[' 0 -eq 0 ']'
2026-03-08T22:42:46.979 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:451: TEST_profile_disallow_builtin_params_override: break
2026-03-08T22:42:46.979 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:459: TEST_profile_disallow_builtin_params_override: teardown td/mclock-config
2026-03-08T22:42:46.979 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/mclock-config
2026-03-08T22:42:46.979 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:42:46.979 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/mclock-config KILL
2026-03-08T22:42:46.979 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:42:46.979 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:42:46.979 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:42:46.980 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:42:46.980 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:42:47.088 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:42:47.088 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:42:47.089 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:42:47.089 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:42:47.090 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']'
2026-03-08T22:42:47.090 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:42:47.091 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:42:47.091 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:42:47.091 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:42:47.092 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:42:47.092 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:42:47.092 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:42:47.093 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:42:47.093 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/mclock-config
2026-03-08T22:42:47.098 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:42:47.098 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:47.098 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:47.099 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.48241
2026-03-08T22:42:47.100 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:42:47.100 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:42:47.100 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:34: run: teardown td/mclock-config
2026-03-08T22:42:47.100 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/mclock-config
2026-03-08T22:42:47.100 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:42:47.100 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/mclock-config KILL
2026-03-08T22:42:47.100 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:42:47.100 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:42:47.100 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:42:47.100 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:42:47.100 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:42:47.103 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:42:47.103 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:42:47.104 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:42:47.104 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:42:47.105 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']'
2026-03-08T22:42:47.105 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:42:47.105 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:42:47.106 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:42:47.106 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:42:47.107 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:42:47.107 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:42:47.108 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:42:47.109 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:42:47.109 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/mclock-config
2026-03-08T22:42:47.110 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:42:47.110 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:47.110 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:47.111 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.48241
2026-03-08T22:42:47.112 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:42:47.112 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:42:47.112 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:31: run: for func in $funcs
2026-03-08T22:42:47.112 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:32: run: setup td/mclock-config
2026-03-08T22:42:47.112 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/mclock-config
2026-03-08T22:42:47.112 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/mclock-config
2026-03-08T22:42:47.112 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/mclock-config
2026-03-08T22:42:47.112 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:42:47.112 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/mclock-config KILL
2026-03-08T22:42:47.112 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:42:47.112 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:42:47.112 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:42:47.112 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:42:47.112 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:42:47.114 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:42:47.114 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:42:47.115 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:42:47.115 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:42:47.116 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']'
2026-03-08T22:42:47.116 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:42:47.116 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:42:47.117 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:42:47.117 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:42:47.117 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:42:47.117 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:42:47.118 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:42:47.119 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:42:47.119 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/mclock-config
2026-03-08T22:42:47.120 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:42:47.120 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:47.120 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:47.120 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.48241
2026-03-08T22:42:47.121 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:42:47.121 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:42:47.121 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/mclock-config
2026-03-08T22:42:47.122 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T22:42:47.122 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:47.122 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:47.122 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.48241
2026-03-08T22:42:47.123 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n
2026-03-08T22:42:47.123 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']'
2026-03-08T22:42:47.123 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T22:42:47.123 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/mclock-config 1' TERM HUP INT
2026-03-08T22:42:47.123 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:33: run: TEST_recovery_limit_adjustment_mclock td/mclock-config
2026-03-08T22:42:47.123 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:202: TEST_recovery_limit_adjustment_mclock: local dir=td/mclock-config
2026-03-08T22:42:47.123 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:204: TEST_recovery_limit_adjustment_mclock: setup td/mclock-config
2026-03-08T22:42:47.123 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/mclock-config
2026-03-08T22:42:47.123 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/mclock-config
2026-03-08T22:42:47.123 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/mclock-config
2026-03-08T22:42:47.124 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:42:47.124 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/mclock-config KILL
2026-03-08T22:42:47.124 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:42:47.124 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:42:47.124 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:42:47.124 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:42:47.124 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:42:47.126 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:42:47.126 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:42:47.127 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:42:47.127 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:42:47.128 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']'
2026-03-08T22:42:47.128 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:42:47.128 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:42:47.129 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:42:47.129 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:42:47.129 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:42:47.129 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:42:47.130 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:42:47.131 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:42:47.131 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/mclock-config
2026-03-08T22:42:47.132 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:42:47.132 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:47.132 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:47.133 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.48241
2026-03-08T22:42:47.134 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:42:47.134 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:42:47.134 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/mclock-config
2026-03-08T22:42:47.135 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T22:42:47.135 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:47.135 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:47.135 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.48241
2026-03-08T22:42:47.137 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n
2026-03-08T22:42:47.137 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']'
2026-03-08T22:42:47.137 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:42:47.137 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/mclock-config 1' TERM HUP INT 2026-03-08T22:42:47.137 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:205: TEST_recovery_limit_adjustment_mclock: run_mon td/mclock-config a 2026-03-08T22:42:47.137 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/mclock-config 2026-03-08T22:42:47.137 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:42:47.137 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:42:47.137 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:42:47.137 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/mclock-config/a 2026-03-08T22:42:47.137 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/mclock-config/a --run-dir=td/mclock-config 2026-03-08T22:42:47.172 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:42:47.172 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:42:47.172 
INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:42:47.173 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:42:47.173 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:47.173 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:42:47.173 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.48241/$cluster-$name.asok' 2026-03-08T22:42:47.173 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/mclock-config/a '--log-file=td/mclock-config/$name.log' '--admin-socket=/tmp/ceph-asok.48241/$cluster-$name.asok' --mon-cluster-log-file=td/mclock-config/log --run-dir=td/mclock-config '--pid-file=td/mclock-config/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T22:42:47.209 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:42:47.210 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T22:42:47.210 
INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:42:47.210 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:42:47.210 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:42:47.210 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:42:47.210 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:42:47.210 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:42:47.210 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:42:47.212 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:42:47.212 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:47.212 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:42:47.212 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-mon.a.asok 2026-03-08T22:42:47.212 
INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:42:47.213 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.48241/ceph-mon.a.asok config get fsid 2026-03-08T22:42:47.267 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:42:47.267 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:42:47.267 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:42:47.267 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:42:47.267 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:42:47.267 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:42:47.268 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:42:47.268 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:42:47.268 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:42:47.268 
INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:47.268 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:42:47.268 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-mon.a.asok 2026-03-08T22:42:47.268 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:42:47.268 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.48241/ceph-mon.a.asok config get mon_host 2026-03-08T22:42:47.321 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:206: TEST_recovery_limit_adjustment_mclock: run_mgr td/mclock-config x 2026-03-08T22:42:47.321 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/mclock-config 2026-03-08T22:42:47.321 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T22:42:47.321 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T22:42:47.321 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T22:42:47.321 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/mclock-config/x 2026-03-08T22:42:47.321 
INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T22:42:47.440 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T22:42:47.440 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:42:47.440 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:42:47.440 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:42:47.440 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:47.440 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:42:47.440 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.48241/$cluster-$name.asok' 2026-03-08T22:42:47.440 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:42:47.441 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/mclock-config/x '--log-file=td/mclock-config/$name.log' '--admin-socket=/tmp/ceph-asok.48241/$cluster-$name.asok' 
--run-dir=td/mclock-config '--pid-file=td/mclock-config/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:42:47.462 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:208: TEST_recovery_limit_adjustment_mclock: run_osd td/mclock-config 0 --osd_op_queue=mclock_scheduler 2026-03-08T22:42:47.462 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/mclock-config 2026-03-08T22:42:47.462 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:42:47.462 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:42:47.462 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:42:47.462 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/mclock-config/0 2026-03-08T22:42:47.462 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=9ecd8d8d-2aa9-4009-8e34-696180900648 --auth-supported=none --mon-host=127.0.0.1:7124 --debug-mclock 20 ' 2026-03-08T22:42:47.462 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:42:47.462 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:42:47.462 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' 
--osd-scrub-load-threshold=2000' 2026-03-08T22:42:47.462 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/mclock-config/0' 2026-03-08T22:42:47.462 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/mclock-config/0/journal' 2026-03-08T22:42:47.462 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:42:47.462 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:42:47.462 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/mclock-config' 2026-03-08T22:42:47.463 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:42:47.463 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:42:47.463 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:42:47.463 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:42:47.463 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:42:47.464 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241 2026-03-08T22:42:47.464 
INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.48241/$cluster-$name.asok' 2026-03-08T22:42:47.464 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.48241/$cluster-$name.asok' 2026-03-08T22:42:47.464 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:42:47.464 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:42:47.465 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:42:47.465 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/mclock-config/$name.log' 2026-03-08T22:42:47.465 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/mclock-config/$name.pid' 2026-03-08T22:42:47.465 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:42:47.465 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:42:47.465 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:42:47.465 
INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:42:47.465 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:42:47.465 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=--osd_op_queue=mclock_scheduler 2026-03-08T22:42:47.465 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/mclock-config/0 2026-03-08T22:42:47.466 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:42:47.467 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=16f7fa3a-ddc2-4cc7-8140-2e54e54fb86c 2026-03-08T22:42:47.467 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 16f7fa3a-ddc2-4cc7-8140-2e54e54fb86c' 2026-03-08T22:42:47.467 INFO:tasks.workunit.client.0.vm07.stdout:add osd0 16f7fa3a-ddc2-4cc7-8140-2e54e54fb86c 2026-03-08T22:42:47.467 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:42:47.480 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBn+61pZPquHBAAxFxFyojiVrA6tW32W4AnoA== 2026-03-08T22:42:47.480 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBn+61pZPquHBAAxFxFyojiVrA6tW32W4AnoA=="}' 2026-03-08T22:42:47.480 
INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 16f7fa3a-ddc2-4cc7-8140-2e54e54fb86c -i td/mclock-config/0/new.json 2026-03-08T22:42:47.613 INFO:tasks.workunit.client.0.vm07.stdout:0 2026-03-08T22:42:47.635 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/mclock-config/0/new.json 2026-03-08T22:42:47.637 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=9ecd8d8d-2aa9-4009-8e34-696180900648 --auth-supported=none --mon-host=127.0.0.1:7124 --debug-mclock 20 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/mclock-config/0 --osd-journal=td/mclock-config/0/journal --chdir= --run-dir=td/mclock-config '--admin-socket=/tmp/ceph-asok.48241/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/mclock-config/$name.log' '--pid-file=td/mclock-config/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_op_queue=mclock_scheduler --mkfs --key AQBn+61pZPquHBAAxFxFyojiVrA6tW32W4AnoA== --osd-uuid 16f7fa3a-ddc2-4cc7-8140-2e54e54fb86c 2026-03-08T22:42:47.657 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:47.657+0000 7f8f7741e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:42:47.658 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:47.659+0000 7f8f7741e780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:42:47.660 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:47.660+0000 7f8f7741e780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:42:47.660 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:47.660+0000 7f8f7741e780 -1 bdev(0x5650bbcfcc00 td/mclock-config/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:42:47.660 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:47.660+0000 7f8f7741e780 -1 bluestore(td/mclock-config/0) _read_fsid unparsable uuid 2026-03-08T22:42:49.763 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/mclock-config/0/keyring 2026-03-08T22:42:49.763 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:42:49.764 INFO:tasks.workunit.client.0.vm07.stdout:adding osd0 key to auth repository 2026-03-08T22:42:49.764 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:42:49.764 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/mclock-config/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:42:49.895 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:42:49.896 INFO:tasks.workunit.client.0.vm07.stdout:start osd.0 2026-03-08T22:42:49.896 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=9ecd8d8d-2aa9-4009-8e34-696180900648 --auth-supported=none --mon-host=127.0.0.1:7124 --debug-mclock 20 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/mclock-config/0 --osd-journal=td/mclock-config/0/journal --chdir= --run-dir=td/mclock-config 
'--admin-socket=/tmp/ceph-asok.48241/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/mclock-config/$name.log' '--pid-file=td/mclock-config/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_op_queue=mclock_scheduler 2026-03-08T22:42:49.896 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:42:49.901 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:42:49.901 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:42:49.916 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:49.915+0000 7f8572e12780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:42:49.918 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:49.919+0000 7f8572e12780 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:42:49.920 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:49.921+0000 7f8572e12780 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:42:50.131 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:42:50.131 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:42:50.131 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:42:50.131 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:42:50.131 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:42:50.131 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:42:50.131 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:42:50.131 INFO:tasks.workunit.client.0.vm07.stdout:0 2026-03-08T22:42:50.132 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:42:50.132 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:42:50.374 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:42:50.722 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:50.723+0000 7f8572e12780 -1 Falling back to public interface 2026-03-08T22:42:51.376 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:42:51.376 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:42:51.376 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:42:51.376 INFO:tasks.workunit.client.0.vm07.stdout:1
2026-03-08T22:42:51.376 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:42:51.376 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:42:51.606 INFO:tasks.workunit.client.0.vm07.stderr:2026-03-08T22:42:51.607+0000 7f8572e12780 -1 osd.0 0 log_to_monitors true
2026-03-08T22:42:51.619 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:42:52.621 INFO:tasks.workunit.client.0.vm07.stdout:2
2026-03-08T22:42:52.621 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:42:52.621 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:42:52.621 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:42:52.622 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:42:52.622 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:42:52.857 INFO:tasks.workunit.client.0.vm07.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/1001580981,v1:127.0.0.1:6803/1001580981] [v2:127.0.0.1:6804/1001580981,v1:127.0.0.1:6805/1001580981] exists,up 16f7fa3a-ddc2-4cc7-8140-2e54e54fb86c
2026-03-08T22:42:52.857 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:42:52.857 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:42:52.857 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:42:52.858 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:210: TEST_recovery_limit_adjustment_mclock: get_asok_path osd.0
2026-03-08T22:42:52.859 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0
2026-03-08T22:42:52.859 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']'
2026-03-08T22:42:52.859 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:52.859 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:52.859 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:52.859 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok
2026-03-08T22:42:52.859 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:210: TEST_recovery_limit_adjustment_mclock: CEPH_ARGS=
2026-03-08T22:42:52.859 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:210: TEST_recovery_limit_adjustment_mclock: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok config get osd_recovery_max_active
2026-03-08T22:42:52.920 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:210: TEST_recovery_limit_adjustment_mclock: local 'recoveries={"osd_recovery_max_active":"0"}'
2026-03-08T22:42:52.920 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:212: TEST_recovery_limit_adjustment_mclock: echo '{"osd_recovery_max_active":"0"}'
2026-03-08T22:42:52.920 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:212: TEST_recovery_limit_adjustment_mclock: grep --quiet osd_recovery_max_active
2026-03-08T22:42:52.921 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:217: TEST_recovery_limit_adjustment_mclock: ceph config set osd.0 osd_recovery_max_active 10
2026-03-08T22:42:53.162 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:218: TEST_recovery_limit_adjustment_mclock: sleep 2
2026-03-08T22:42:55.164 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:220: TEST_recovery_limit_adjustment_mclock: get_asok_path osd.0
2026-03-08T22:42:55.164 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0
2026-03-08T22:42:55.164 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']'
2026-03-08T22:42:55.164 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:42:55.164 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:55.164 INFO:tasks.workunit.client.0.vm07.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:55.164 INFO:tasks.workunit.client.0.vm07.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.48241/ceph-osd.0.asok
2026-03-08T22:42:55.164 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:220: TEST_recovery_limit_adjustment_mclock: CEPH_ARGS=
2026-03-08T22:42:55.164 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:220: TEST_recovery_limit_adjustment_mclock: ceph --format=json daemon /tmp/ceph-asok.48241/ceph-osd.0.asok config get osd_recovery_max_active
2026-03-08T22:42:55.219 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:220: TEST_recovery_limit_adjustment_mclock: local 'max_recoveries={"osd_recovery_max_active":"10"}'
2026-03-08T22:42:55.219 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:221: TEST_recovery_limit_adjustment_mclock: test '{"osd_recovery_max_active":"10"}' = '{"osd_recovery_max_active":"0"}'
2026-03-08T22:42:55.219 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:221: TEST_recovery_limit_adjustment_mclock: return 1
2026-03-08T22:42:55.219 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh:33: run: return 1
2026-03-08T22:42:55.219 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2379: main: code=1
2026-03-08T22:42:55.219 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2381: main: teardown td/mclock-config 1
2026-03-08T22:42:55.219 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/mclock-config
2026-03-08T22:42:55.219 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=1
2026-03-08T22:42:55.219 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/mclock-config KILL
2026-03-08T22:42:55.220 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:42:55.220 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:42:55.220 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:42:55.220 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:42:55.220 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:42:55.337 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:42:55.337 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:42:55.338 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:42:55.338 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:42:55.340 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' xfs == btrfs ']'
2026-03-08T22:42:55.340 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:42:55.340 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:42:55.341 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:42:55.341 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:42:55.341 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:42:55.342 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:42:55.343 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:42:55.344 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o 1 = 1 ']'
2026-03-08T22:42:55.344 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:190: teardown: '[' -n '' ']'
2026-03-08T22:42:55.344 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:194: teardown: mkdir -p /home/ubuntu/cephtest/archive/log
2026-03-08T22:42:55.345 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:195: teardown: mv td/mclock-config/mgr.x.log td/mclock-config/mon.a.log td/mclock-config/osd.0.log /home/ubuntu/cephtest/archive/log
2026-03-08T22:42:55.345 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/mclock-config
2026-03-08T22:42:55.350 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:42:55.350 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:42:55.350 INFO:tasks.workunit.client.0.vm07.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.48241
2026-03-08T22:42:55.350 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.48241
2026-03-08T22:42:55.351 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:42:55.351 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:42:55.351 INFO:tasks.workunit.client.0.vm07.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2382: main: return 1
2026-03-08T22:42:55.352 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-08T22:42:55.352 INFO:tasks.workunit:Stopping ['misc'] on client.0...
2026-03-08T22:42:55.352 DEBUG:teuthology.orchestra.run.vm07:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.0 /home/ubuntu/cephtest/clone.client.0
2026-03-08T22:42:55.800 ERROR:teuthology.run_tasks:Saw exception from tasks.
Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 105, in run_tasks
    manager = run_one_task(taskname, ctx=ctx, config=config)
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 83, in run_one_task
    return task(**kwargs)
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/workunit.py", line 144, in task
    _spawn_on_all_clients(ctx, refspec, all_tasks, config.get('env'),
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/workunit.py", line 292, in _spawn_on_all_clients
    with parallel() as p:
  File "/home/teuthos/teuthology/teuthology/parallel.py", line 84, in __exit__
    for result in self:
  File "/home/teuthos/teuthology/teuthology/parallel.py", line 98, in __next__
    resurrect_traceback(result)
  File "/home/teuthos/teuthology/teuthology/parallel.py", line 30, in resurrect_traceback
    raise exc.exc_info[1]
  File "/home/teuthos/teuthology/teuthology/parallel.py", line 23, in capture_traceback
    return func(*args, **kwargs)
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/workunit.py", line 433, in _run_tests
    remote.run(
  File "/home/teuthos/teuthology/teuthology/orchestra/remote.py", line 575, in run
    r = self._runner(client=self.ssh, name=self.shortname, **kwargs)
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 461, in run
    r.wait()
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 161, in wait
    self._raise_for_status()
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 181, in _raise_for_status
    raise CommandFailedError(
teuthology.exceptions.CommandFailedError: Command failed (workunit test misc/mclock-config.sh) on vm07 with status 1: 'mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=569c3e99c9b32a51b4eaf08731c728f4513ed589 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh'
2026-03-08T22:42:55.801 DEBUG:teuthology.run_tasks:Unwinding manager install
2026-03-08T22:42:55.803 INFO:teuthology.task.install.util:Removing shipped files: /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer...
2026-03-08T22:42:55.803 DEBUG:teuthology.orchestra.run.vm07:> sudo rm -f -- /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer
2026-03-08T22:42:55.836 INFO:teuthology.task.install.rpm:Removing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, ceph-volume, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd on rpm system.
2026-03-08T22:42:55.836 DEBUG:teuthology.orchestra.run.vm07:>
2026-03-08T22:42:55.836 DEBUG:teuthology.orchestra.run.vm07:> for d in ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse ceph-volume librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd ; do
2026-03-08T22:42:55.836 DEBUG:teuthology.orchestra.run.vm07:> sudo yum -y remove $d || true
2026-03-08T22:42:55.836 DEBUG:teuthology.orchestra.run.vm07:> done
2026-03-08T22:42:56.056 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-08T22:42:56.056 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-08T22:42:56.056 INFO:teuthology.orchestra.run.vm07.stdout: Package Arch Version Repository Size
2026-03-08T22:42:56.057 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-08T22:42:56.057 INFO:teuthology.orchestra.run.vm07.stdout:Removing:
2026-03-08T22:42:56.057 INFO:teuthology.orchestra.run.vm07.stdout: ceph-radosgw x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 39 M
2026-03-08T22:42:56.057 INFO:teuthology.orchestra.run.vm07.stdout:Removing unused dependencies:
2026-03-08T22:42:56.057 INFO:teuthology.orchestra.run.vm07.stdout: mailcap noarch 2.1.49-5.el9 @baseos 78 k
2026-03-08T22:42:56.057 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:42:56.057 INFO:teuthology.orchestra.run.vm07.stdout:Transaction Summary
2026-03-08T22:42:56.057 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-08T22:42:56.057 INFO:teuthology.orchestra.run.vm07.stdout:Remove 2 Packages
2026-03-08T22:42:56.057 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:42:56.057 INFO:teuthology.orchestra.run.vm07.stdout:Freed space: 39 M
2026-03-08T22:42:56.057 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction check
2026-03-08T22:42:56.060 INFO:teuthology.orchestra.run.vm07.stdout:Transaction check succeeded.
2026-03-08T22:42:56.060 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction test
2026-03-08T22:42:56.074 INFO:teuthology.orchestra.run.vm07.stdout:Transaction test succeeded.
2026-03-08T22:42:56.074 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction
2026-03-08T22:42:56.148 INFO:teuthology.orchestra.run.vm07.stdout: Preparing : 1/1
2026-03-08T22:42:56.172 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-radosgw-2:19.2.3-678.ge911bdeb.el9.x86_64 1/2
2026-03-08T22:42:56.172 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T22:42:56.172 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service".
2026-03-08T22:42:56.172 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-radosgw.target".
2026-03-08T22:42:56.172 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-radosgw.target".
2026-03-08T22:42:56.172 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:42:56.176 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-radosgw-2:19.2.3-678.ge911bdeb.el9.x86_64 1/2
2026-03-08T22:42:56.185 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-radosgw-2:19.2.3-678.ge911bdeb.el9.x86_64 1/2
2026-03-08T22:42:56.201 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : mailcap-2.1.49-5.el9.noarch 2/2
2026-03-08T22:42:56.282 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: mailcap-2.1.49-5.el9.noarch 2/2
2026-03-08T22:42:56.282 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-radosgw-2:19.2.3-678.ge911bdeb.el9.x86_64 1/2
2026-03-08T22:42:56.478 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 2/2
2026-03-08T22:42:56.478 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:42:56.478 INFO:teuthology.orchestra.run.vm07.stdout:Removed:
2026-03-08T22:42:56.478 INFO:teuthology.orchestra.run.vm07.stdout: ceph-radosgw-2:19.2.3-678.ge911bdeb.el9.x86_64 mailcap-2.1.49-5.el9.noarch
2026-03-08T22:42:56.478 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:42:56.478 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-08T22:42:56.700 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-08T22:42:56.700 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-08T22:42:56.700 INFO:teuthology.orchestra.run.vm07.stdout: Package Arch Version Repository Size
2026-03-08T22:42:56.700 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-08T22:42:56.700 INFO:teuthology.orchestra.run.vm07.stdout:Removing:
2026-03-08T22:42:56.701 INFO:teuthology.orchestra.run.vm07.stdout: ceph-test x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 210 M
2026-03-08T22:42:56.701 INFO:teuthology.orchestra.run.vm07.stdout:Removing unused dependencies:
2026-03-08T22:42:56.701 INFO:teuthology.orchestra.run.vm07.stdout: libxslt x86_64 1.1.34-12.el9 @appstream 743 k
2026-03-08T22:42:56.701 INFO:teuthology.orchestra.run.vm07.stdout: socat x86_64 1.7.4.1-8.el9 @appstream 1.1 M
2026-03-08T22:42:56.701 INFO:teuthology.orchestra.run.vm07.stdout: xmlstarlet x86_64 1.6.1-20.el9 @appstream 195 k
2026-03-08T22:42:56.701 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:42:56.701 INFO:teuthology.orchestra.run.vm07.stdout:Transaction Summary
2026-03-08T22:42:56.701 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-08T22:42:56.701 INFO:teuthology.orchestra.run.vm07.stdout:Remove 4 Packages
2026-03-08T22:42:56.701 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:42:56.701 INFO:teuthology.orchestra.run.vm07.stdout:Freed space: 212 M
2026-03-08T22:42:56.701 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction check
2026-03-08T22:42:56.704 INFO:teuthology.orchestra.run.vm07.stdout:Transaction check succeeded.
2026-03-08T22:42:56.704 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction test
2026-03-08T22:42:56.728 INFO:teuthology.orchestra.run.vm07.stdout:Transaction test succeeded.
2026-03-08T22:42:56.729 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction
2026-03-08T22:42:56.794 INFO:teuthology.orchestra.run.vm07.stdout: Preparing : 1/1
2026-03-08T22:42:56.799 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-test-2:19.2.3-678.ge911bdeb.el9.x86_64 1/4
2026-03-08T22:42:56.801 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : xmlstarlet-1.6.1-20.el9.x86_64 2/4
2026-03-08T22:42:56.804 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libxslt-1.1.34-12.el9.x86_64 3/4
2026-03-08T22:42:56.819 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : socat-1.7.4.1-8.el9.x86_64 4/4
2026-03-08T22:42:56.917 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: socat-1.7.4.1-8.el9.x86_64 4/4
2026-03-08T22:42:56.921 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-test-2:19.2.3-678.ge911bdeb.el9.x86_64 1/4
2026-03-08T22:42:56.921 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 2/4
2026-03-08T22:42:56.921 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 3/4
2026-03-08T22:42:57.077 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 4/4
2026-03-08T22:42:57.077 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:42:57.077 INFO:teuthology.orchestra.run.vm07.stdout:Removed:
2026-03-08T22:42:57.077 INFO:teuthology.orchestra.run.vm07.stdout: ceph-test-2:19.2.3-678.ge911bdeb.el9.x86_64 libxslt-1.1.34-12.el9.x86_64
2026-03-08T22:42:57.078 INFO:teuthology.orchestra.run.vm07.stdout: socat-1.7.4.1-8.el9.x86_64 xmlstarlet-1.6.1-20.el9.x86_64
2026-03-08T22:42:57.078 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:42:57.078 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-08T22:42:57.292 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-08T22:42:57.293 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-08T22:42:57.293 INFO:teuthology.orchestra.run.vm07.stdout: Package Arch Version Repository Size
2026-03-08T22:42:57.293 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-08T22:42:57.293 INFO:teuthology.orchestra.run.vm07.stdout:Removing:
2026-03-08T22:42:57.293 INFO:teuthology.orchestra.run.vm07.stdout: ceph x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 0
2026-03-08T22:42:57.293 INFO:teuthology.orchestra.run.vm07.stdout:Removing unused dependencies:
2026-03-08T22:42:57.293 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mds x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 7.5 M
2026-03-08T22:42:57.293 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mon x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 18 M
2026-03-08T22:42:57.293 INFO:teuthology.orchestra.run.vm07.stdout: lua x86_64 5.4.4-4.el9 @appstream 593 k
2026-03-08T22:42:57.293 INFO:teuthology.orchestra.run.vm07.stdout: lua-devel x86_64 5.4.4-4.el9 @crb 49 k
2026-03-08T22:42:57.293 INFO:teuthology.orchestra.run.vm07.stdout: luarocks noarch 3.9.2-5.el9 @epel 692 k
2026-03-08T22:42:57.293 INFO:teuthology.orchestra.run.vm07.stdout: unzip x86_64 6.0-59.el9 @baseos 389 k
2026-03-08T22:42:57.293 INFO:teuthology.orchestra.run.vm07.stdout: zip x86_64 3.0-35.el9 @baseos 724 k
2026-03-08T22:42:57.294 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:42:57.294 INFO:teuthology.orchestra.run.vm07.stdout:Transaction Summary
2026-03-08T22:42:57.294 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-08T22:42:57.294 INFO:teuthology.orchestra.run.vm07.stdout:Remove 8 Packages
2026-03-08T22:42:57.294 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:42:57.294 INFO:teuthology.orchestra.run.vm07.stdout:Freed space: 28 M
2026-03-08T22:42:57.294 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction check
2026-03-08T22:42:57.297 INFO:teuthology.orchestra.run.vm07.stdout:Transaction check succeeded.
2026-03-08T22:42:57.297 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction test
2026-03-08T22:42:57.323 INFO:teuthology.orchestra.run.vm07.stdout:Transaction test succeeded.
2026-03-08T22:42:57.323 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction
2026-03-08T22:42:57.369 INFO:teuthology.orchestra.run.vm07.stdout: Preparing : 1/1
2026-03-08T22:42:57.374 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-2:19.2.3-678.ge911bdeb.el9.x86_64 1/8
2026-03-08T22:42:57.378 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : luarocks-3.9.2-5.el9.noarch 2/8
2026-03-08T22:42:57.380 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : lua-devel-5.4.4-4.el9.x86_64 3/8
2026-03-08T22:42:57.383 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : zip-3.0-35.el9.x86_64 4/8
2026-03-08T22:42:57.385 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : unzip-6.0-59.el9.x86_64 5/8
2026-03-08T22:42:57.387 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : lua-5.4.4-4.el9.x86_64 6/8
2026-03-08T22:42:57.409 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mds-2:19.2.3-678.ge911bdeb.el9.x86_64 7/8
2026-03-08T22:42:57.409 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T22:42:57.409 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service".
2026-03-08T22:42:57.409 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mds.target".
2026-03-08T22:42:57.409 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mds.target".
2026-03-08T22:42:57.409 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:42:57.410 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-mds-2:19.2.3-678.ge911bdeb.el9.x86_64 7/8
2026-03-08T22:42:57.418 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mds-2:19.2.3-678.ge911bdeb.el9.x86_64 7/8
2026-03-08T22:42:57.443 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mon-2:19.2.3-678.ge911bdeb.el9.x86_64 8/8
2026-03-08T22:42:57.443 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T22:42:57.443 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service".
2026-03-08T22:42:57.443 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mon.target".
2026-03-08T22:42:57.443 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mon.target".
2026-03-08T22:42:57.443 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:42:57.445 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-mon-2:19.2.3-678.ge911bdeb.el9.x86_64 8/8
2026-03-08T22:42:57.555 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mon-2:19.2.3-678.ge911bdeb.el9.x86_64 8/8
2026-03-08T22:42:57.555 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-2:19.2.3-678.ge911bdeb.el9.x86_64 1/8
2026-03-08T22:42:57.555 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mds-2:19.2.3-678.ge911bdeb.el9.x86_64 2/8
2026-03-08T22:42:57.555 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mon-2:19.2.3-678.ge911bdeb.el9.x86_64 3/8
2026-03-08T22:42:57.555 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : lua-5.4.4-4.el9.x86_64 4/8
2026-03-08T22:42:57.555 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : lua-devel-5.4.4-4.el9.x86_64 5/8
2026-03-08T22:42:57.555 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : luarocks-3.9.2-5.el9.noarch 6/8
2026-03-08T22:42:57.555 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : unzip-6.0-59.el9.x86_64 7/8
2026-03-08T22:42:57.613 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : zip-3.0-35.el9.x86_64 8/8
2026-03-08T22:42:57.613 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:42:57.613 INFO:teuthology.orchestra.run.vm07.stdout:Removed:
2026-03-08T22:42:57.613 INFO:teuthology.orchestra.run.vm07.stdout: ceph-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:42:57.613 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mds-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:42:57.613 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mon-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:42:57.613 INFO:teuthology.orchestra.run.vm07.stdout: lua-5.4.4-4.el9.x86_64
2026-03-08T22:42:57.613 INFO:teuthology.orchestra.run.vm07.stdout: lua-devel-5.4.4-4.el9.x86_64
2026-03-08T22:42:57.613 INFO:teuthology.orchestra.run.vm07.stdout: luarocks-3.9.2-5.el9.noarch
2026-03-08T22:42:57.613 INFO:teuthology.orchestra.run.vm07.stdout: unzip-6.0-59.el9.x86_64
2026-03-08T22:42:57.613 INFO:teuthology.orchestra.run.vm07.stdout: zip-3.0-35.el9.x86_64
2026-03-08T22:42:57.613 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:42:57.613 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-08T22:42:57.848 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout:===========================================================================================
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: Package Arch Version Repository Size
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout:===========================================================================================
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout:Removing:
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: ceph-base x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 23 M
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout:Removing dependent packages:
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: ceph-immutable-object-cache x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 431 k
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 3.4 M
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-cephadm noarch 2:19.2.3-678.ge911bdeb.el9 @ceph-noarch 806 k
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-dashboard noarch 2:19.2.3-678.ge911bdeb.el9 @ceph-noarch 88 M
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-diskprediction-local noarch 2:19.2.3-678.ge911bdeb.el9 @ceph-noarch 66 M
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-rook noarch 2:19.2.3-678.ge911bdeb.el9 @ceph-noarch 563 k
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: ceph-osd x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 59 M
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: ceph-volume noarch 2:19.2.3-678.ge911bdeb.el9 @ceph-noarch 1.4 M
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: rbd-mirror x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 13 M
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout:Removing unused dependencies:
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: abseil-cpp x86_64 20211102.0-4.el9 @epel 1.9 M
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: ceph-common x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 85 M
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: ceph-grafana-dashboards noarch 2:19.2.3-678.ge911bdeb.el9 @ceph-noarch 628 k
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-modules-core noarch 2:19.2.3-678.ge911bdeb.el9 @ceph-noarch 1.5 M
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: ceph-prometheus-alerts noarch 2:19.2.3-678.ge911bdeb.el9 @ceph-noarch 52 k
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: ceph-selinux x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 138 k
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: cryptsetup x86_64 2.8.1-3.el9 @baseos 770 k
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas x86_64 3.0.4-9.el9 @appstream 68 k
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 @appstream 11 M
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 @appstream 39 k
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: gperftools-libs x86_64 2.9.1-3.el9 @epel 1.4 M
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: grpc-data noarch 1.46.7-10.el9 @epel 13 k
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: ledmon-libs x86_64 1.1.0-3.el9 @baseos 80 k
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: libcephsqlite x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 425 k
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: libconfig x86_64 1.7.2-9.el9 @baseos 220 k
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: libgfortran x86_64 11.5.0-14.el9 @baseos 2.8 M
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: liboath x86_64 2.6.12-1.el9 @epel 94 k
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: libquadmath x86_64 11.5.0-14.el9 @baseos 330 k
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: libradosstriper1 x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 1.6 M
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: libstoragemgmt x86_64 1.10.1-1.el9 @appstream 685 k
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: libunwind x86_64 1.6.2-1.el9 @epel 170 k
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: openblas x86_64 0.3.29-1.el9 @appstream 112 k
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: openblas-openmp x86_64 0.3.29-1.el9 @appstream 46 M
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: pciutils x86_64 3.7.0-7.el9 @baseos 216 k
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: protobuf x86_64 3.14.0-17.el9 @appstream 3.5 M
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: protobuf-compiler x86_64 3.14.0-17.el9 @crb 2.9 M
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: python3-asyncssh noarch 2.13.2-5.el9 @epel 3.9 M
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: python3-autocommand noarch 2.2.2-8.el9 @epel 82 k
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: python3-babel noarch 2.9.1-2.el9 @appstream 27 M
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 @epel 254 k
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: python3-bcrypt x86_64 3.2.2-1.el9 @epel 87 k
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: python3-cachetools noarch 4.2.4-1.el9 @epel 93 k
2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: python3-ceph-common x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 702 k
2026-03-08T22:42:57.857 
INFO:teuthology.orchestra.run.vm07.stdout: python3-certifi noarch 2023.05.07-4.el9 @epel 6.3 k 2026-03-08T22:42:57.857 INFO:teuthology.orchestra.run.vm07.stdout: python3-cffi x86_64 1.14.5-5.el9 @baseos 1.0 M 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-chardet noarch 4.0.0-5.el9 @anaconda 1.4 M 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-cheroot noarch 10.0.1-4.el9 @epel 682 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-cherrypy noarch 18.6.1-2.el9 @epel 1.1 M 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-cryptography x86_64 36.0.1-5.el9 @baseos 4.5 M 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-devel x86_64 3.9.25-3.el9 @appstream 765 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-google-auth noarch 1:2.45.0-1.el9 @epel 1.4 M 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-grpcio x86_64 1.46.7-10.el9 @epel 6.7 M 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-grpcio-tools x86_64 1.46.7-10.el9 @epel 418 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-idna noarch 2.10-7.el9.1 @anaconda 513 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco noarch 8.2.1-3.el9 @epel 3.7 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 @epel 24 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 @epel 55 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-context noarch 6.0.1-3.el9 @epel 31 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 @epel 33 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-text noarch 4.0.0-2.el9 @epel 51 k 
2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-jinja2 noarch 2.11.3-8.el9 @appstream 1.1 M 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-jsonpatch noarch 1.21-16.el9 @koji-override-0 55 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-jsonpointer noarch 2.0-4.el9 @koji-override-0 34 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 @epel 21 M 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 @appstream 832 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-logutils noarch 0.3.5-21.el9 @epel 126 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-mako noarch 1.1.4-6.el9 @appstream 534 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-markupsafe x86_64 1.1.1-12.el9 @appstream 60 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-more-itertools noarch 8.12.0-2.el9 @epel 378 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-natsort noarch 7.1.1-5.el9 @epel 215 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-numpy x86_64 1:1.23.5-2.el9 @appstream 30 M 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 @appstream 1.7 M 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-oauthlib noarch 3.1.1-5.el9 @koji-override-0 888 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-packaging noarch 20.9-5.el9 @appstream 248 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-pecan noarch 1.4.2-3.el9 @epel 1.3 M 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-ply noarch 3.11-14.el9 @baseos 430 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: 
python3-portend noarch 3.1.0-2.el9 @epel 20 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-prettytable noarch 0.7.2-27.el9 @koji-override-0 166 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-protobuf noarch 3.14.0-17.el9 @appstream 1.4 M 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 @epel 389 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyasn1 noarch 0.4.8-7.el9 @appstream 622 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 @appstream 1.0 M 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-pycparser noarch 2.20-6.el9 @baseos 745 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyparsing noarch 2.4.7-9.el9 @baseos 635 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-pysocks noarch 1.7.1-12.el9 @anaconda 88 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-pytz noarch 2021.1-5.el9 @koji-override-0 176 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-repoze-lru noarch 0.7-16.el9 @epel 83 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-requests noarch 2.25.1-10.el9 @baseos 405 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 @appstream 119 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-routes noarch 2.5.1-5.el9 @epel 459 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-rsa noarch 4.9-2.el9 @epel 202 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-scipy x86_64 1.9.3-2.el9 @appstream 76 M 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-tempora noarch 5.0.0-2.el9 @epel 96 k 2026-03-08T22:42:57.858 
INFO:teuthology.orchestra.run.vm07.stdout: python3-toml noarch 0.10.2-6.el9 @appstream 99 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-typing-extensions noarch 4.15.0-1.el9 @epel 447 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-urllib3 noarch 1.26.5-7.el9 @baseos 746 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-webob noarch 1.8.8-2.el9 @epel 1.2 M 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-websocket-client noarch 1.2.3-2.el9 @epel 319 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 @epel 1.9 M 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: python3-zc-lockfile noarch 2.0-10.el9 @epel 35 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: qatlib x86_64 25.08.0-2.el9 @appstream 639 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: qatlib-service x86_64 25.08.0-2.el9 @appstream 69 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: qatzip-libs x86_64 1.3.1-1.el9 @appstream 148 k 2026-03-08T22:42:57.858 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-08T22:42:57.859 INFO:teuthology.orchestra.run.vm07.stdout:Transaction Summary 2026-03-08T22:42:57.859 INFO:teuthology.orchestra.run.vm07.stdout:=========================================================================================== 2026-03-08T22:42:57.859 INFO:teuthology.orchestra.run.vm07.stdout:Remove 103 Packages 2026-03-08T22:42:57.859 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-08T22:42:57.859 INFO:teuthology.orchestra.run.vm07.stdout:Freed space: 613 M 2026-03-08T22:42:57.859 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction check 2026-03-08T22:42:57.887 INFO:teuthology.orchestra.run.vm07.stdout:Transaction check succeeded. 
2026-03-08T22:42:57.887 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction test 2026-03-08T22:42:58.005 INFO:teuthology.orchestra.run.vm07.stdout:Transaction test succeeded. 2026-03-08T22:42:58.005 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction 2026-03-08T22:42:58.164 INFO:teuthology.orchestra.run.vm07.stdout: Preparing : 1/1 2026-03-08T22:42:58.165 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-mgr-rook-2:19.2.3-678.ge911bdeb.el9.noarch 1/103 2026-03-08T22:42:58.173 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mgr-rook-2:19.2.3-678.ge911bdeb.el9.noarch 1/103 2026-03-08T22:42:58.194 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mgr-2:19.2.3-678.ge911bdeb.el9.x86_64 2/103 2026-03-08T22:42:58.194 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-08T22:42:58.194 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service". 2026-03-08T22:42:58.194 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mgr.target". 2026-03-08T22:42:58.194 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mgr.target". 
2026-03-08T22:42:58.194 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-08T22:42:58.194 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-mgr-2:19.2.3-678.ge911bdeb.el9.x86_64 2/103 2026-03-08T22:42:58.208 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mgr-2:19.2.3-678.ge911bdeb.el9.x86_64 2/103 2026-03-08T22:42:58.233 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-mgr-modules-core-2:19.2.3-678.ge911bdeb.el9 3/103 2026-03-08T22:42:58.233 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-mgr-dashboard-2:19.2.3-678.ge911bdeb.el9.no 4/103 2026-03-08T22:42:58.290 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mgr-dashboard-2:19.2.3-678.ge911bdeb.el9.no 4/103 2026-03-08T22:42:58.298 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-kubernetes-1:26.1.0-3.el9.noarch 5/103 2026-03-08T22:42:58.303 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-requests-oauthlib-1.3.0-12.el9.noarch 6/103 2026-03-08T22:42:58.303 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-mgr-cephadm-2:19.2.3-678.ge911bdeb.el9.noar 7/103 2026-03-08T22:42:58.315 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mgr-cephadm-2:19.2.3-678.ge911bdeb.el9.noar 7/103 2026-03-08T22:42:58.322 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-cherrypy-18.6.1-2.el9.noarch 8/103 2026-03-08T22:42:58.326 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-cheroot-10.0.1-4.el9.noarch 9/103 2026-03-08T22:42:58.335 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-grpcio-tools-1.46.7-10.el9.x86_64 10/103 2026-03-08T22:42:58.339 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-grpcio-1.46.7-10.el9.x86_64 11/103 2026-03-08T22:42:58.361 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-osd-2:19.2.3-678.ge911bdeb.el9.x86_64 12/103 2026-03-08T22:42:58.362 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not 
supported for this. 2026-03-08T22:42:58.362 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service". 2026-03-08T22:42:58.362 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-osd.target". 2026-03-08T22:42:58.362 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-osd.target". 2026-03-08T22:42:58.362 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-08T22:42:58.368 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-osd-2:19.2.3-678.ge911bdeb.el9.x86_64 12/103 2026-03-08T22:42:58.377 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-osd-2:19.2.3-678.ge911bdeb.el9.x86_64 12/103 2026-03-08T22:42:58.394 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-volume-2:19.2.3-678.ge911bdeb.el9.noarch 13/103 2026-03-08T22:42:58.394 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-08T22:42:58.394 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-volume@*.service" escaped as "ceph-volume@\x2a.service". 
2026-03-08T22:42:58.394 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-08T22:42:58.403 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-volume-2:19.2.3-678.ge911bdeb.el9.noarch 13/103 2026-03-08T22:42:58.413 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-volume-2:19.2.3-678.ge911bdeb.el9.noarch 13/103 2026-03-08T22:42:58.416 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-jaraco-collections-3.0.0-8.el9.noarch 14/103 2026-03-08T22:42:58.420 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-jaraco-text-4.0.0-2.el9.noarch 15/103 2026-03-08T22:42:58.424 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-jinja2-2.11.3-8.el9.noarch 16/103 2026-03-08T22:42:58.432 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-requests-2.25.1-10.el9.noarch 17/103 2026-03-08T22:42:58.444 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-google-auth-1:2.45.0-1.el9.noarch 18/103 2026-03-08T22:42:58.450 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-pecan-1.4.2-3.el9.noarch 19/103 2026-03-08T22:42:58.461 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-rsa-4.9-2.el9.noarch 20/103 2026-03-08T22:42:58.472 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-pyasn1-modules-0.4.8-7.el9.noarch 21/103 2026-03-08T22:42:58.503 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-urllib3-1.26.5-7.el9.noarch 22/103 2026-03-08T22:42:58.510 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-babel-2.9.1-2.el9.noarch 23/103 2026-03-08T22:42:58.512 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-jaraco-classes-3.2.1-5.el9.noarch 24/103 2026-03-08T22:42:58.521 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-pyOpenSSL-21.0.0-1.el9.noarch 25/103 2026-03-08T22:42:58.531 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-asyncssh-2.13.2-5.el9.noarch 26/103 2026-03-08T22:42:58.531 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : 
ceph-mgr-diskprediction-local-2:19.2.3-678.ge911 27/103 2026-03-08T22:42:58.539 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:19.2.3-678.ge911 27/103 2026-03-08T22:42:58.634 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-jsonpatch-1.21-16.el9.noarch 28/103 2026-03-08T22:42:58.650 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-scipy-1.9.3-2.el9.x86_64 29/103 2026-03-08T22:42:58.667 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 30/103 2026-03-08T22:42:58.667 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/multi-user.target.wants/libstoragemgmt.service". 2026-03-08T22:42:58.667 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-08T22:42:58.668 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libstoragemgmt-1.10.1-1.el9.x86_64 30/103 2026-03-08T22:42:58.698 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 30/103 2026-03-08T22:42:58.714 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 31/103 2026-03-08T22:42:58.720 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-cryptography-36.0.1-5.el9.x86_64 32/103 2026-03-08T22:42:58.723 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : protobuf-compiler-3.14.0-17.el9.x86_64 33/103 2026-03-08T22:42:58.726 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-bcrypt-3.2.2-1.el9.x86_64 34/103 2026-03-08T22:42:58.748 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: rbd-mirror-2:19.2.3-678.ge911bdeb.el9.x86_64 35/103 2026-03-08T22:42:58.748 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-08T22:42:58.748 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service". 
2026-03-08T22:42:58.748 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target". 2026-03-08T22:42:58.748 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target". 2026-03-08T22:42:58.748 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-08T22:42:58.749 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : rbd-mirror-2:19.2.3-678.ge911bdeb.el9.x86_64 35/103 2026-03-08T22:42:58.764 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: rbd-mirror-2:19.2.3-678.ge911bdeb.el9.x86_64 35/103 2026-03-08T22:42:58.768 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-mako-1.1.4-6.el9.noarch 36/103 2026-03-08T22:42:58.771 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-jaraco-context-6.0.1-3.el9.noarch 37/103 2026-03-08T22:42:58.774 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-packaging-20.9-5.el9.noarch 38/103 2026-03-08T22:42:58.776 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-portend-3.1.0-2.el9.noarch 39/103 2026-03-08T22:42:58.779 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-tempora-5.0.0-2.el9.noarch 40/103 2026-03-08T22:42:58.782 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-jaraco-functools-3.5.0-2.el9.noarch 41/103 2026-03-08T22:42:58.786 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-routes-2.5.1-5.el9.noarch 42/103 2026-03-08T22:42:58.791 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-cffi-1.14.5-5.el9.x86_64 43/103 2026-03-08T22:42:58.841 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-pycparser-2.20-6.el9.noarch 44/103 2026-03-08T22:42:58.852 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-numpy-1:1.23.5-2.el9.x86_64 45/103 2026-03-08T22:42:58.854 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : flexiblas-netlib-3.0.4-9.el9.x86_64 46/103 2026-03-08T22:42:58.860 
INFO:teuthology.orchestra.run.vm07.stdout: Erasing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 47/103 2026-03-08T22:42:58.862 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : openblas-openmp-0.3.29-1.el9.x86_64 48/103 2026-03-08T22:42:58.866 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libgfortran-11.5.0-14.el9.x86_64 49/103 2026-03-08T22:42:58.869 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 50/103 2026-03-08T22:42:58.889 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-immutable-object-cache-2:19.2.3-678.ge911bd 51/103 2026-03-08T22:42:58.889 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-08T22:42:58.889 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service". 2026-03-08T22:42:58.889 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-08T22:42:58.890 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-immutable-object-cache-2:19.2.3-678.ge911bd 51/103 2026-03-08T22:42:58.897 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-immutable-object-cache-2:19.2.3-678.ge911bd 51/103 2026-03-08T22:42:58.898 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : openblas-0.3.29-1.el9.x86_64 52/103 2026-03-08T22:42:58.900 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : flexiblas-3.0.4-9.el9.x86_64 53/103 2026-03-08T22:42:58.903 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-ply-3.11-14.el9.noarch 54/103 2026-03-08T22:42:58.905 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-repoze-lru-0.7-16.el9.noarch 55/103 2026-03-08T22:42:58.907 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-jaraco-8.2.1-3.el9.noarch 56/103 2026-03-08T22:42:58.910 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-more-itertools-8.12.0-2.el9.noarch 57/103 2026-03-08T22:42:58.912 
INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-toml-0.10.2-6.el9.noarch 58/103 2026-03-08T22:42:58.915 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-pytz-2021.1-5.el9.noarch 59/103 2026-03-08T22:42:58.923 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-pyparsing-2.4.7-9.el9.noarch 60/103 2026-03-08T22:42:58.930 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-backports-tarfile-1.2.0-1.el9.noarch 61/103 2026-03-08T22:42:58.935 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-devel-3.9.25-3.el9.x86_64 62/103 2026-03-08T22:42:58.937 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-jsonpointer-2.0-4.el9.noarch 63/103 2026-03-08T22:42:58.940 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-typing-extensions-4.15.0-1.el9.noarch 64/103 2026-03-08T22:42:58.943 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-idna-2.10-7.el9.1.noarch 65/103 2026-03-08T22:42:58.949 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-pysocks-1.7.1-12.el9.noarch 66/103 2026-03-08T22:42:58.954 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-pyasn1-0.4.8-7.el9.noarch 67/103 2026-03-08T22:42:58.960 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-logutils-0.3.5-21.el9.noarch 68/103 2026-03-08T22:42:58.963 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-webob-1.8.8-2.el9.noarch 69/103 2026-03-08T22:42:58.969 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-cachetools-4.2.4-1.el9.noarch 70/103 2026-03-08T22:42:58.971 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-chardet-4.0.0-5.el9.noarch 71/103 2026-03-08T22:42:58.974 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-autocommand-2.2.2-8.el9.noarch 72/103 2026-03-08T22:42:58.979 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : grpc-data-1.46.7-10.el9.noarch 73/103 2026-03-08T22:42:58.982 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : 
python3-protobuf-3.14.0-17.el9.noarch 74/103 2026-03-08T22:42:58.985 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-zc-lockfile-2.0-10.el9.noarch 75/103 2026-03-08T22:42:58.994 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-natsort-7.1.1-5.el9.noarch 76/103 2026-03-08T22:42:58.999 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-oauthlib-3.1.1-5.el9.noarch 77/103 2026-03-08T22:42:59.001 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-websocket-client-1.2.3-2.el9.noarch 78/103 2026-03-08T22:42:59.004 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-certifi-2023.05.07-4.el9.noarch 79/103 2026-03-08T22:42:59.005 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-grafana-dashboards-2:19.2.3-678.ge911bdeb.e 80/103 2026-03-08T22:42:59.010 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-prometheus-alerts-2:19.2.3-678.ge911bdeb.el 81/103 2026-03-08T22:42:59.013 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-werkzeug-2.0.3-3.el9.1.noarch 82/103 2026-03-08T22:42:59.036 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-base-2:19.2.3-678.ge911bdeb.el9.x86_64 83/103 2026-03-08T22:42:59.036 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph.target". 2026-03-08T22:42:59.036 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-crash.service". 
2026-03-08T22:42:59.036 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-08T22:42:59.044 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-base-2:19.2.3-678.ge911bdeb.el9.x86_64 83/103 2026-03-08T22:42:59.075 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-base-2:19.2.3-678.ge911bdeb.el9.x86_64 83/103 2026-03-08T22:42:59.075 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64 84/103 2026-03-08T22:42:59.090 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64 84/103 2026-03-08T22:42:59.097 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : qatzip-libs-1.3.1-1.el9.x86_64 85/103 2026-03-08T22:42:59.101 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-ceph-common-2:19.2.3-678.ge911bdeb.el9.x 86/103 2026-03-08T22:42:59.103 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-prettytable-0.7.2-27.el9.noarch 87/103 2026-03-08T22:42:59.103 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-selinux-2:19.2.3-678.ge911bdeb.el9.x86_64 88/103 2026-03-08T22:43:04.909 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-selinux-2:19.2.3-678.ge911bdeb.el9.x86_64 88/103 2026-03-08T22:43:04.909 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /sys 2026-03-08T22:43:04.909 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /proc 2026-03-08T22:43:04.909 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /mnt 2026-03-08T22:43:04.909 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /var/tmp 2026-03-08T22:43:04.909 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /home 2026-03-08T22:43:04.909 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /root 2026-03-08T22:43:04.909 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /tmp 2026-03-08T22:43:04.909 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-08T22:43:04.923 
INFO:teuthology.orchestra.run.vm07.stdout: Erasing : qatlib-25.08.0-2.el9.x86_64 89/103
2026-03-08T22:43:04.940 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: qatlib-service-25.08.0-2.el9.x86_64 90/103
2026-03-08T22:43:04.940 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : qatlib-service-25.08.0-2.el9.x86_64 90/103
2026-03-08T22:43:04.946 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: qatlib-service-25.08.0-2.el9.x86_64 90/103
2026-03-08T22:43:04.949 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : gperftools-libs-2.9.1-3.el9.x86_64 91/103
2026-03-08T22:43:04.951 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libunwind-1.6.2-1.el9.x86_64 92/103
2026-03-08T22:43:04.954 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : pciutils-3.7.0-7.el9.x86_64 93/103
2026-03-08T22:43:04.956 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : liboath-2.6.12-1.el9.x86_64 94/103
2026-03-08T22:43:04.956 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libradosstriper1-2:19.2.3-678.ge911bdeb.el9.x86_ 95/103
2026-03-08T22:43:04.971 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: libradosstriper1-2:19.2.3-678.ge911bdeb.el9.x86_ 95/103
2026-03-08T22:43:04.973 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ledmon-libs-1.1.0-3.el9.x86_64 96/103
2026-03-08T22:43:04.976 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libquadmath-11.5.0-14.el9.x86_64 97/103
2026-03-08T22:43:04.978 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-markupsafe-1.1.1-12.el9.x86_64 98/103
2026-03-08T22:43:04.980 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : protobuf-3.14.0-17.el9.x86_64 99/103
2026-03-08T22:43:04.985 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libconfig-1.7.2-9.el9.x86_64 100/103
2026-03-08T22:43:04.993 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : cryptsetup-2.8.1-3.el9.x86_64 101/103
2026-03-08T22:43:04.998 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : abseil-cpp-20211102.0-4.el9.x86_64 102/103
2026-03-08T22:43:04.998 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libcephsqlite-2:19.2.3-678.ge911bdeb.el9.x86_64 103/103
2026-03-08T22:43:05.103 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: libcephsqlite-2:19.2.3-678.ge911bdeb.el9.x86_64 103/103
2026-03-08T22:43:05.103 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : abseil-cpp-20211102.0-4.el9.x86_64 1/103
2026-03-08T22:43:05.103 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-base-2:19.2.3-678.ge911bdeb.el9.x86_64 2/103
2026-03-08T22:43:05.103 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64 3/103
2026-03-08T22:43:05.103 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-grafana-dashboards-2:19.2.3-678.ge911bdeb.e 4/103
2026-03-08T22:43:05.103 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-immutable-object-cache-2:19.2.3-678.ge911bd 5/103
2026-03-08T22:43:05.103 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-2:19.2.3-678.ge911bdeb.el9.x86_64 6/103
2026-03-08T22:43:05.103 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-cephadm-2:19.2.3-678.ge911bdeb.el9.noar 7/103
2026-03-08T22:43:05.103 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-dashboard-2:19.2.3-678.ge911bdeb.el9.no 8/103
2026-03-08T22:43:05.103 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-diskprediction-local-2:19.2.3-678.ge911 9/103
2026-03-08T22:43:05.103 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-modules-core-2:19.2.3-678.ge911bdeb.el9 10/103
2026-03-08T22:43:05.103 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-rook-2:19.2.3-678.ge911bdeb.el9.noarch 11/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-osd-2:19.2.3-678.ge911bdeb.el9.x86_64 12/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-prometheus-alerts-2:19.2.3-678.ge911bdeb.el 13/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-selinux-2:19.2.3-678.ge911bdeb.el9.x86_64 14/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-volume-2:19.2.3-678.ge911bdeb.el9.noarch 15/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : cryptsetup-2.8.1-3.el9.x86_64 16/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 17/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 18/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 19/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 20/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : grpc-data-1.46.7-10.el9.noarch 21/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 22/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libcephsqlite-2:19.2.3-678.ge911bdeb.el9.x86_64 23/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 24/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 25/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 26/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 27/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libradosstriper1-2:19.2.3-678.ge911bdeb.el9.x86_ 28/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 29/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 30/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 31/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 32/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : pciutils-3.7.0-7.el9.x86_64 33/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : protobuf-3.14.0-17.el9.x86_64 34/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : protobuf-compiler-3.14.0-17.el9.x86_64 35/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 36/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 37/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 38/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 39/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 40/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 41/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-ceph-common-2:19.2.3-678.ge911bdeb.el9.x 42/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 43/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 44/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-chardet-4.0.0-5.el9.noarch 45/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 46/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 47/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 48/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 49/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 50/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-grpcio-1.46.7-10.el9.x86_64 51/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-grpcio-tools-1.46.7-10.el9.x86_64 52/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-idna-2.10-7.el9.1.noarch 53/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 54/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 55/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 56/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 57/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 58/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 59/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 60/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jsonpatch-1.21-16.el9.noarch 61/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jsonpointer-2.0-4.el9.noarch 62/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 63/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 64/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 65/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 66/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 67/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 68/103
2026-03-08T22:43:05.104 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 69/103
2026-03-08T22:43:05.105 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 70/103
2026-03-08T22:43:05.105 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 71/103
2026-03-08T22:43:05.105 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-oauthlib-3.1.1-5.el9.noarch 72/103
2026-03-08T22:43:05.105 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-packaging-20.9-5.el9.noarch 73/103
2026-03-08T22:43:05.105 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 74/103
2026-03-08T22:43:05.105 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-ply-3.11-14.el9.noarch 75/103
2026-03-08T22:43:05.105 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 76/103
2026-03-08T22:43:05.105 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-prettytable-0.7.2-27.el9.noarch 77/103
2026-03-08T22:43:05.105 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-protobuf-3.14.0-17.el9.noarch 78/103
2026-03-08T22:43:05.105 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 79/103
2026-03-08T22:43:05.105 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 80/103
2026-03-08T22:43:05.105 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 81/103
2026-03-08T22:43:05.105 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 82/103
2026-03-08T22:43:05.105 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pyparsing-2.4.7-9.el9.noarch 83/103
2026-03-08T22:43:05.105 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pysocks-1.7.1-12.el9.noarch 84/103
2026-03-08T22:43:05.105 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pytz-2021.1-5.el9.noarch 85/103
2026-03-08T22:43:05.105 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 86/103
2026-03-08T22:43:05.105 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 87/103
2026-03-08T22:43:05.106 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 88/103
2026-03-08T22:43:05.106 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 89/103
2026-03-08T22:43:05.106 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 90/103
2026-03-08T22:43:05.106 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 91/103
2026-03-08T22:43:05.106 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 92/103
2026-03-08T22:43:05.106 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 93/103
2026-03-08T22:43:05.106 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 94/103
2026-03-08T22:43:05.106 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 95/103
2026-03-08T22:43:05.106 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 96/103
2026-03-08T22:43:05.106 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 97/103
2026-03-08T22:43:05.106 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 98/103
2026-03-08T22:43:05.106 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 99/103
2026-03-08T22:43:05.106 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : qatlib-25.08.0-2.el9.x86_64 100/103
2026-03-08T22:43:05.106 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : qatlib-service-25.08.0-2.el9.x86_64 101/103
2026-03-08T22:43:05.106 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : qatzip-libs-1.3.1-1.el9.x86_64 102/103
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : rbd-mirror-2:19.2.3-678.ge911bdeb.el9.x86_64 103/103
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout:Removed:
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: abseil-cpp-20211102.0-4.el9.x86_64
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: ceph-base-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: ceph-grafana-dashboards-2:19.2.3-678.ge911bdeb.el9.noarch
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: ceph-immutable-object-cache-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-cephadm-2:19.2.3-678.ge911bdeb.el9.noarch
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-dashboard-2:19.2.3-678.ge911bdeb.el9.noarch
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-diskprediction-local-2:19.2.3-678.ge911bdeb.el9.noarch
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-modules-core-2:19.2.3-678.ge911bdeb.el9.noarch
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-rook-2:19.2.3-678.ge911bdeb.el9.noarch
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: ceph-osd-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: ceph-prometheus-alerts-2:19.2.3-678.ge911bdeb.el9.noarch
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: ceph-selinux-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: ceph-volume-2:19.2.3-678.ge911bdeb.el9.noarch
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: cryptsetup-2.8.1-3.el9.x86_64
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas-3.0.4-9.el9.x86_64
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: gperftools-libs-2.9.1-3.el9.x86_64
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: grpc-data-1.46.7-10.el9.noarch
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: ledmon-libs-1.1.0-3.el9.x86_64
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: libcephsqlite-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: libconfig-1.7.2-9.el9.x86_64
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: libgfortran-11.5.0-14.el9.x86_64
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: liboath-2.6.12-1.el9.x86_64
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: libquadmath-11.5.0-14.el9.x86_64
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: libradosstriper1-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: libunwind-1.6.2-1.el9.x86_64
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: openblas-0.3.29-1.el9.x86_64
2026-03-08T22:43:05.194 INFO:teuthology.orchestra.run.vm07.stdout: openblas-openmp-0.3.29-1.el9.x86_64
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: pciutils-3.7.0-7.el9.x86_64
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: protobuf-3.14.0-17.el9.x86_64
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: protobuf-compiler-3.14.0-17.el9.x86_64
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-asyncssh-2.13.2-5.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-autocommand-2.2.2-8.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-babel-2.9.1-2.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-bcrypt-3.2.2-1.el9.x86_64
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-cachetools-4.2.4-1.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-ceph-common-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-certifi-2023.05.07-4.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-cffi-1.14.5-5.el9.x86_64
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-chardet-4.0.0-5.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-cheroot-10.0.1-4.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-cherrypy-18.6.1-2.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-cryptography-36.0.1-5.el9.x86_64
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-devel-3.9.25-3.el9.x86_64
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-google-auth-1:2.45.0-1.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-grpcio-1.46.7-10.el9.x86_64
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-grpcio-tools-1.46.7-10.el9.x86_64
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-idna-2.10-7.el9.1.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-8.2.1-3.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-context-6.0.1-3.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-text-4.0.0-2.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-jinja2-2.11.3-8.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-jsonpatch-1.21-16.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-jsonpointer-2.0-4.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-logutils-0.3.5-21.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-mako-1.1.4-6.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-markupsafe-1.1.1-12.el9.x86_64
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-more-itertools-8.12.0-2.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-natsort-7.1.1-5.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-numpy-1:1.23.5-2.el9.x86_64
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-oauthlib-3.1.1-5.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-packaging-20.9-5.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-pecan-1.4.2-3.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-ply-3.11-14.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-portend-3.1.0-2.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-prettytable-0.7.2-27.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-protobuf-3.14.0-17.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyasn1-0.4.8-7.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-pycparser-2.20-6.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyparsing-2.4.7-9.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-pysocks-1.7.1-12.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-pytz-2021.1-5.el9.noarch
2026-03-08T22:43:05.195 INFO:teuthology.orchestra.run.vm07.stdout: python3-repoze-lru-0.7-16.el9.noarch
2026-03-08T22:43:05.196 INFO:teuthology.orchestra.run.vm07.stdout: python3-requests-2.25.1-10.el9.noarch
2026-03-08T22:43:05.196 INFO:teuthology.orchestra.run.vm07.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch
2026-03-08T22:43:05.196 INFO:teuthology.orchestra.run.vm07.stdout: python3-routes-2.5.1-5.el9.noarch
2026-03-08T22:43:05.196 INFO:teuthology.orchestra.run.vm07.stdout: python3-rsa-4.9-2.el9.noarch
2026-03-08T22:43:05.196 INFO:teuthology.orchestra.run.vm07.stdout: python3-scipy-1.9.3-2.el9.x86_64
2026-03-08T22:43:05.196 INFO:teuthology.orchestra.run.vm07.stdout: python3-tempora-5.0.0-2.el9.noarch
2026-03-08T22:43:05.196 INFO:teuthology.orchestra.run.vm07.stdout: python3-toml-0.10.2-6.el9.noarch
2026-03-08T22:43:05.196 INFO:teuthology.orchestra.run.vm07.stdout: python3-typing-extensions-4.15.0-1.el9.noarch
2026-03-08T22:43:05.196 INFO:teuthology.orchestra.run.vm07.stdout: python3-urllib3-1.26.5-7.el9.noarch
2026-03-08T22:43:05.196 INFO:teuthology.orchestra.run.vm07.stdout: python3-webob-1.8.8-2.el9.noarch
2026-03-08T22:43:05.196 INFO:teuthology.orchestra.run.vm07.stdout: python3-websocket-client-1.2.3-2.el9.noarch
2026-03-08T22:43:05.196 INFO:teuthology.orchestra.run.vm07.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch
2026-03-08T22:43:05.196 INFO:teuthology.orchestra.run.vm07.stdout: python3-zc-lockfile-2.0-10.el9.noarch
2026-03-08T22:43:05.196 INFO:teuthology.orchestra.run.vm07.stdout: qatlib-25.08.0-2.el9.x86_64
2026-03-08T22:43:05.196 INFO:teuthology.orchestra.run.vm07.stdout: qatlib-service-25.08.0-2.el9.x86_64
2026-03-08T22:43:05.196 INFO:teuthology.orchestra.run.vm07.stdout: qatzip-libs-1.3.1-1.el9.x86_64
2026-03-08T22:43:05.196 INFO:teuthology.orchestra.run.vm07.stdout: rbd-mirror-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:43:05.196 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:43:05.196 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-08T22:43:05.429 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-08T22:43:05.429 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-08T22:43:05.429 INFO:teuthology.orchestra.run.vm07.stdout: Package Arch Version Repository Size
2026-03-08T22:43:05.429 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-08T22:43:05.429 INFO:teuthology.orchestra.run.vm07.stdout:Removing:
2026-03-08T22:43:05.429 INFO:teuthology.orchestra.run.vm07.stdout: cephadm noarch 2:19.2.3-678.ge911bdeb.el9 @ceph-noarch 775 k
2026-03-08T22:43:05.429 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:43:05.429 INFO:teuthology.orchestra.run.vm07.stdout:Transaction Summary
2026-03-08T22:43:05.429 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-08T22:43:05.429 INFO:teuthology.orchestra.run.vm07.stdout:Remove 1 Package
2026-03-08T22:43:05.429 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:43:05.429 INFO:teuthology.orchestra.run.vm07.stdout:Freed space: 775 k
2026-03-08T22:43:05.430 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction check
2026-03-08T22:43:05.431 INFO:teuthology.orchestra.run.vm07.stdout:Transaction check succeeded.
2026-03-08T22:43:05.431 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction test
2026-03-08T22:43:05.432 INFO:teuthology.orchestra.run.vm07.stdout:Transaction test succeeded.
2026-03-08T22:43:05.433 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction
2026-03-08T22:43:05.448 INFO:teuthology.orchestra.run.vm07.stdout: Preparing : 1/1
2026-03-08T22:43:05.449 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : cephadm-2:19.2.3-678.ge911bdeb.el9.noarch 1/1
2026-03-08T22:43:05.585 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: cephadm-2:19.2.3-678.ge911bdeb.el9.noarch 1/1
2026-03-08T22:43:05.624 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : cephadm-2:19.2.3-678.ge911bdeb.el9.noarch 1/1
2026-03-08T22:43:05.624 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:43:05.624 INFO:teuthology.orchestra.run.vm07.stdout:Removed:
2026-03-08T22:43:05.624 INFO:teuthology.orchestra.run.vm07.stdout: cephadm-2:19.2.3-678.ge911bdeb.el9.noarch
2026-03-08T22:43:05.624 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:43:05.624 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-08T22:43:05.806 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: ceph-immutable-object-cache
2026-03-08T22:43:05.806 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-08T22:43:05.809 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-08T22:43:05.809 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-08T22:43:05.810 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-08T22:43:05.988 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: ceph-mgr
2026-03-08T22:43:05.988 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-08T22:43:05.991 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-08T22:43:05.992 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-08T22:43:05.992 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-08T22:43:06.161 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: ceph-mgr-dashboard
2026-03-08T22:43:06.161 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-08T22:43:06.164 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-08T22:43:06.165 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-08T22:43:06.165 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-08T22:43:06.337 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: ceph-mgr-diskprediction-local
2026-03-08T22:43:06.337 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-08T22:43:06.340 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-08T22:43:06.340 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-08T22:43:06.341 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-08T22:43:06.516 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: ceph-mgr-rook
2026-03-08T22:43:06.516 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-08T22:43:06.519 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-08T22:43:06.520 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-08T22:43:06.520 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-08T22:43:06.695 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: ceph-mgr-cephadm
2026-03-08T22:43:06.695 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-08T22:43:06.698 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-08T22:43:06.699 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-08T22:43:06.699 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-08T22:43:06.882 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-08T22:43:06.883 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-08T22:43:06.883 INFO:teuthology.orchestra.run.vm07.stdout: Package Arch Version Repository Size
2026-03-08T22:43:06.883 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-08T22:43:06.883 INFO:teuthology.orchestra.run.vm07.stdout:Removing:
2026-03-08T22:43:06.883 INFO:teuthology.orchestra.run.vm07.stdout: ceph-fuse x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 3.6 M
2026-03-08T22:43:06.883 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:43:06.883 INFO:teuthology.orchestra.run.vm07.stdout:Transaction Summary
2026-03-08T22:43:06.883 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-08T22:43:06.883 INFO:teuthology.orchestra.run.vm07.stdout:Remove 1 Package
2026-03-08T22:43:06.883 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:43:06.883 INFO:teuthology.orchestra.run.vm07.stdout:Freed space: 3.6 M
2026-03-08T22:43:06.883 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction check
2026-03-08T22:43:06.885 INFO:teuthology.orchestra.run.vm07.stdout:Transaction check succeeded.
2026-03-08T22:43:06.885 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction test
2026-03-08T22:43:06.894 INFO:teuthology.orchestra.run.vm07.stdout:Transaction test succeeded.
2026-03-08T22:43:06.894 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction
2026-03-08T22:43:06.922 INFO:teuthology.orchestra.run.vm07.stdout: Preparing : 1/1
2026-03-08T22:43:06.939 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 1/1
2026-03-08T22:43:07.007 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 1/1
2026-03-08T22:43:07.052 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 1/1
2026-03-08T22:43:07.052 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:43:07.052 INFO:teuthology.orchestra.run.vm07.stdout:Removed:
2026-03-08T22:43:07.052 INFO:teuthology.orchestra.run.vm07.stdout: ceph-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:43:07.052 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:43:07.052 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-08T22:43:07.258 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: ceph-volume
2026-03-08T22:43:07.258 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-08T22:43:07.261 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-08T22:43:07.262 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-08T22:43:07.262 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-08T22:43:07.444 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-08T22:43:07.445 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-08T22:43:07.445 INFO:teuthology.orchestra.run.vm07.stdout: Package Arch Version Repo Size
2026-03-08T22:43:07.445 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-08T22:43:07.445 INFO:teuthology.orchestra.run.vm07.stdout:Removing:
2026-03-08T22:43:07.445 INFO:teuthology.orchestra.run.vm07.stdout: librados-devel x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 456 k
2026-03-08T22:43:07.445 INFO:teuthology.orchestra.run.vm07.stdout:Removing dependent packages:
2026-03-08T22:43:07.445 INFO:teuthology.orchestra.run.vm07.stdout: libcephfs-devel x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 153 k
2026-03-08T22:43:07.445 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:43:07.445 INFO:teuthology.orchestra.run.vm07.stdout:Transaction Summary
2026-03-08T22:43:07.445 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-08T22:43:07.445 INFO:teuthology.orchestra.run.vm07.stdout:Remove 2 Packages
2026-03-08T22:43:07.445 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:43:07.445 INFO:teuthology.orchestra.run.vm07.stdout:Freed space: 610 k
2026-03-08T22:43:07.445 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction check
2026-03-08T22:43:07.447 INFO:teuthology.orchestra.run.vm07.stdout:Transaction check succeeded.
2026-03-08T22:43:07.447 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction test
2026-03-08T22:43:07.458 INFO:teuthology.orchestra.run.vm07.stdout:Transaction test succeeded.
2026-03-08T22:43:07.458 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction
2026-03-08T22:43:07.485 INFO:teuthology.orchestra.run.vm07.stdout: Preparing : 1/1
2026-03-08T22:43:07.514 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libcephfs-devel-2:19.2.3-678.ge911bdeb.el9.x86_64 1/2
2026-03-08T22:43:07.528 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : librados-devel-2:19.2.3-678.ge911bdeb.el9.x86_64 2/2
2026-03-08T22:43:07.607 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: librados-devel-2:19.2.3-678.ge911bdeb.el9.x86_64 2/2
2026-03-08T22:43:07.607 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libcephfs-devel-2:19.2.3-678.ge911bdeb.el9.x86_64 1/2
2026-03-08T22:43:07.654 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librados-devel-2:19.2.3-678.ge911bdeb.el9.x86_64 2/2
2026-03-08T22:43:07.655 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:43:07.655 INFO:teuthology.orchestra.run.vm07.stdout:Removed:
2026-03-08T22:43:07.655 INFO:teuthology.orchestra.run.vm07.stdout: libcephfs-devel-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:43:07.655 INFO:teuthology.orchestra.run.vm07.stdout: librados-devel-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:43:07.655 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:43:07.655 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-08T22:43:07.852 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-08T22:43:07.852 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-08T22:43:07.852 INFO:teuthology.orchestra.run.vm07.stdout: Package Arch Version Repo Size
2026-03-08T22:43:07.852 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-08T22:43:07.852 INFO:teuthology.orchestra.run.vm07.stdout:Removing:
2026-03-08T22:43:07.852 INFO:teuthology.orchestra.run.vm07.stdout: libcephfs2 x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 3.0 M
2026-03-08T22:43:07.852 INFO:teuthology.orchestra.run.vm07.stdout:Removing dependent packages:
2026-03-08T22:43:07.852 INFO:teuthology.orchestra.run.vm07.stdout: python3-cephfs x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 514 k
2026-03-08T22:43:07.852 INFO:teuthology.orchestra.run.vm07.stdout:Removing unused dependencies:
2026-03-08T22:43:07.852 INFO:teuthology.orchestra.run.vm07.stdout: python3-ceph-argparse x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 187 k
2026-03-08T22:43:07.852 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:43:07.852 INFO:teuthology.orchestra.run.vm07.stdout:Transaction Summary
2026-03-08T22:43:07.852 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-08T22:43:07.852 INFO:teuthology.orchestra.run.vm07.stdout:Remove 3 Packages
2026-03-08T22:43:07.852 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:43:07.852 INFO:teuthology.orchestra.run.vm07.stdout:Freed space: 3.7 M
2026-03-08T22:43:07.852 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction check
2026-03-08T22:43:07.854 INFO:teuthology.orchestra.run.vm07.stdout:Transaction check succeeded.
2026-03-08T22:43:07.854 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction test
2026-03-08T22:43:07.870 INFO:teuthology.orchestra.run.vm07.stdout:Transaction test succeeded.
2026-03-08T22:43:07.870 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction
2026-03-08T22:43:07.900 INFO:teuthology.orchestra.run.vm07.stdout: Preparing : 1/1
2026-03-08T22:43:07.904 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-cephfs-2:19.2.3-678.ge911bdeb.el9.x86_64 1/3
2026-03-08T22:43:07.906 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-ceph-argparse-2:19.2.3-678.ge911bdeb.el9.x86 2/3
2026-03-08T22:43:07.906 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libcephfs2-2:19.2.3-678.ge911bdeb.el9.x86_64 3/3
2026-03-08T22:43:07.979 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: libcephfs2-2:19.2.3-678.ge911bdeb.el9.x86_64 3/3
2026-03-08T22:43:07.980 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libcephfs2-2:19.2.3-678.ge911bdeb.el9.x86_64 1/3
2026-03-08T22:43:07.980 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-ceph-argparse-2:19.2.3-678.ge911bdeb.el9.x86 2/3
2026-03-08T22:43:08.027 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cephfs-2:19.2.3-678.ge911bdeb.el9.x86_64 3/3
2026-03-08T22:43:08.027 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:43:08.027 INFO:teuthology.orchestra.run.vm07.stdout:Removed:
2026-03-08T22:43:08.027 INFO:teuthology.orchestra.run.vm07.stdout: libcephfs2-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:43:08.027 INFO:teuthology.orchestra.run.vm07.stdout: python3-ceph-argparse-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:43:08.027 INFO:teuthology.orchestra.run.vm07.stdout: python3-cephfs-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:43:08.027 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:43:08.027 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-08T22:43:08.210 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: libcephfs-devel
2026-03-08T22:43:08.211 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-08T22:43:08.213 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-08T22:43:08.214 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-08T22:43:08.214 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-08T22:43:08.400 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-08T22:43:08.402 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-08T22:43:08.402 INFO:teuthology.orchestra.run.vm07.stdout: Package Arch Version Repository Size
2026-03-08T22:43:08.402 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-08T22:43:08.402 INFO:teuthology.orchestra.run.vm07.stdout:Removing:
2026-03-08T22:43:08.402 INFO:teuthology.orchestra.run.vm07.stdout: librados2 x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 12 M
2026-03-08T22:43:08.402 INFO:teuthology.orchestra.run.vm07.stdout:Removing dependent packages:
2026-03-08T22:43:08.402 INFO:teuthology.orchestra.run.vm07.stdout: python3-rados x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 1.1 M
2026-03-08T22:43:08.402 INFO:teuthology.orchestra.run.vm07.stdout: python3-rbd x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 1.1 M
2026-03-08T22:43:08.402 INFO:teuthology.orchestra.run.vm07.stdout: python3-rgw x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 265 k
2026-03-08T22:43:08.402 INFO:teuthology.orchestra.run.vm07.stdout: qemu-kvm-block-rbd x86_64 17:10.1.0-15.el9 @appstream 37 k
2026-03-08T22:43:08.402 INFO:teuthology.orchestra.run.vm07.stdout: rbd-fuse x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 227 k
2026-03-08T22:43:08.402 INFO:teuthology.orchestra.run.vm07.stdout: rbd-nbd x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 490 k
2026-03-08T22:43:08.402 INFO:teuthology.orchestra.run.vm07.stdout:Removing unused dependencies:
2026-03-08T22:43:08.402 INFO:teuthology.orchestra.run.vm07.stdout: boost-program-options x86_64 1.75.0-13.el9 @appstream 276 k
2026-03-08T22:43:08.402 INFO:teuthology.orchestra.run.vm07.stdout: libarrow x86_64 9.0.0-15.el9 @epel 18 M
2026-03-08T22:43:08.402 INFO:teuthology.orchestra.run.vm07.stdout: libarrow-doc noarch 9.0.0-15.el9 @epel 122 k
2026-03-08T22:43:08.402 INFO:teuthology.orchestra.run.vm07.stdout: libnbd x86_64 1.20.3-4.el9 @appstream 453 k
2026-03-08T22:43:08.402 INFO:teuthology.orchestra.run.vm07.stdout: libpmemobj x86_64 1.12.1-1.el9 @appstream 383 k
2026-03-08T22:43:08.402 INFO:teuthology.orchestra.run.vm07.stdout: librabbitmq x86_64 0.11.0-7.el9 @appstream 102 k
2026-03-08T22:43:08.402 INFO:teuthology.orchestra.run.vm07.stdout: librbd1 x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 13 M
2026-03-08T22:43:08.402 INFO:teuthology.orchestra.run.vm07.stdout: librdkafka x86_64 1.6.1-102.el9 @appstream 2.0 M
2026-03-08T22:43:08.402 INFO:teuthology.orchestra.run.vm07.stdout: librgw2 x86_64 2:19.2.3-678.ge911bdeb.el9 @ceph 19 M
2026-03-08T22:43:08.402 INFO:teuthology.orchestra.run.vm07.stdout: lttng-ust x86_64 2.12.0-6.el9 @appstream 1.0 M
2026-03-08T22:43:08.402 INFO:teuthology.orchestra.run.vm07.stdout: parquet-libs x86_64 9.0.0-15.el9 @epel 2.8 M
2026-03-08T22:43:08.402 INFO:teuthology.orchestra.run.vm07.stdout: re2 x86_64 1:20211101-20.el9 @epel 472 k
2026-03-08T22:43:08.402 INFO:teuthology.orchestra.run.vm07.stdout: thrift x86_64 0.15.0-4.el9 @epel 4.8 M
2026-03-08T22:43:08.402 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:43:08.402 INFO:teuthology.orchestra.run.vm07.stdout:Transaction Summary
2026-03-08T22:43:08.402 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-08T22:43:08.402 INFO:teuthology.orchestra.run.vm07.stdout:Remove 20 Packages
2026-03-08T22:43:08.402 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:43:08.403 INFO:teuthology.orchestra.run.vm07.stdout:Freed space: 79 M
2026-03-08T22:43:08.403 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction check
2026-03-08T22:43:08.406 INFO:teuthology.orchestra.run.vm07.stdout:Transaction check succeeded.
2026-03-08T22:43:08.407 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction test
2026-03-08T22:43:08.429 INFO:teuthology.orchestra.run.vm07.stdout:Transaction test succeeded.
2026-03-08T22:43:08.429 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction
2026-03-08T22:43:08.489 INFO:teuthology.orchestra.run.vm07.stdout: Preparing : 1/1
2026-03-08T22:43:08.498 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : rbd-nbd-2:19.2.3-678.ge911bdeb.el9.x86_64 1/20
2026-03-08T22:43:08.510 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : rbd-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 2/20
2026-03-08T22:43:08.515 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-rgw-2:19.2.3-678.ge911bdeb.el9.x86_64 3/20
2026-03-08T22:43:08.516 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : librgw2-2:19.2.3-678.ge911bdeb.el9.x86_64 4/20
2026-03-08T22:43:08.533 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: librgw2-2:19.2.3-678.ge911bdeb.el9.x86_64 4/20
2026-03-08T22:43:08.545 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : parquet-libs-9.0.0-15.el9.x86_64 5/20
2026-03-08T22:43:08.547 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-rbd-2:19.2.3-678.ge911bdeb.el9.x86_64 6/20
2026-03-08T22:43:08.549 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-rados-2:19.2.3-678.ge911bdeb.el9.x86_64 7/20
2026-03-08T22:43:08.562 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 8/20
2026-03-08T22:43:08.567 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libarrow-doc-9.0.0-15.el9.noarch 9/20
2026-03-08T22:43:08.567 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : librbd1-2:19.2.3-678.ge911bdeb.el9.x86_64 10/20
2026-03-08T22:43:08.585 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: librbd1-2:19.2.3-678.ge911bdeb.el9.x86_64 10/20
2026-03-08T22:43:08.585 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : librados2-2:19.2.3-678.ge911bdeb.el9.x86_64 11/20
2026-03-08T22:43:08.585 INFO:teuthology.orchestra.run.vm07.stdout:warning: file /etc/ceph: remove failed: No such file or directory
2026-03-08T22:43:08.585 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:43:08.606 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: librados2-2:19.2.3-678.ge911bdeb.el9.x86_64 11/20
2026-03-08T22:43:08.612 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libarrow-9.0.0-15.el9.x86_64 12/20
2026-03-08T22:43:08.616 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : re2-1:20211101-20.el9.x86_64 13/20
2026-03-08T22:43:08.630 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : lttng-ust-2.12.0-6.el9.x86_64 14/20
2026-03-08T22:43:08.644 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : thrift-0.15.0-4.el9.x86_64 15/20
2026-03-08T22:43:08.671 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libnbd-1.20.3-4.el9.x86_64 16/20
2026-03-08T22:43:08.679 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libpmemobj-1.12.1-1.el9.x86_64 17/20
2026-03-08T22:43:08.684 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : boost-program-options-1.75.0-13.el9.x86_64 18/20
2026-03-08T22:43:08.687 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : librabbitmq-0.11.0-7.el9.x86_64 19/20
2026-03-08T22:43:08.709 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : librdkafka-1.6.1-102.el9.x86_64 20/20
2026-03-08T22:43:08.765 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: librdkafka-1.6.1-102.el9.x86_64 20/20
2026-03-08T22:43:08.765 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 1/20
2026-03-08T22:43:08.765 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 2/20
2026-03-08T22:43:08.765 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 3/20
2026-03-08T22:43:08.765 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libnbd-1.20.3-4.el9.x86_64 4/20
2026-03-08T22:43:08.765 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 5/20
2026-03-08T22:43:08.765 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 6/20
2026-03-08T22:43:08.765 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librados2-2:19.2.3-678.ge911bdeb.el9.x86_64 7/20
2026-03-08T22:43:08.765 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librbd1-2:19.2.3-678.ge911bdeb.el9.x86_64 8/20
2026-03-08T22:43:08.765 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 9/20
2026-03-08T22:43:08.765 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librgw2-2:19.2.3-678.ge911bdeb.el9.x86_64 10/20
2026-03-08T22:43:08.765 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 11/20
2026-03-08T22:43:08.765 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 12/20
2026-03-08T22:43:08.765 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-rados-2:19.2.3-678.ge911bdeb.el9.x86_64 13/20
2026-03-08T22:43:08.765 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-rbd-2:19.2.3-678.ge911bdeb.el9.x86_64 14/20
2026-03-08T22:43:08.765 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-rgw-2:19.2.3-678.ge911bdeb.el9.x86_64 15/20
2026-03-08T22:43:08.765 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 16/20
2026-03-08T22:43:08.765 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : rbd-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64 17/20
2026-03-08T22:43:08.765 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : rbd-nbd-2:19.2.3-678.ge911bdeb.el9.x86_64 18/20
2026-03-08T22:43:08.765 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : re2-1:20211101-20.el9.x86_64 19/20
2026-03-08T22:43:08.845 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 20/20
2026-03-08T22:43:08.846 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:43:08.846 INFO:teuthology.orchestra.run.vm07.stdout:Removed:
2026-03-08T22:43:08.846 INFO:teuthology.orchestra.run.vm07.stdout: boost-program-options-1.75.0-13.el9.x86_64
2026-03-08T22:43:08.846 INFO:teuthology.orchestra.run.vm07.stdout: libarrow-9.0.0-15.el9.x86_64
2026-03-08T22:43:08.846 INFO:teuthology.orchestra.run.vm07.stdout: libarrow-doc-9.0.0-15.el9.noarch
2026-03-08T22:43:08.846 INFO:teuthology.orchestra.run.vm07.stdout: libnbd-1.20.3-4.el9.x86_64
2026-03-08T22:43:08.846 INFO:teuthology.orchestra.run.vm07.stdout: libpmemobj-1.12.1-1.el9.x86_64
2026-03-08T22:43:08.846 INFO:teuthology.orchestra.run.vm07.stdout: librabbitmq-0.11.0-7.el9.x86_64
2026-03-08T22:43:08.846 INFO:teuthology.orchestra.run.vm07.stdout: librados2-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:43:08.846 INFO:teuthology.orchestra.run.vm07.stdout: librbd1-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:43:08.846 INFO:teuthology.orchestra.run.vm07.stdout: librdkafka-1.6.1-102.el9.x86_64
2026-03-08T22:43:08.846 INFO:teuthology.orchestra.run.vm07.stdout: librgw2-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:43:08.846 INFO:teuthology.orchestra.run.vm07.stdout: lttng-ust-2.12.0-6.el9.x86_64
2026-03-08T22:43:08.846 INFO:teuthology.orchestra.run.vm07.stdout: parquet-libs-9.0.0-15.el9.x86_64
2026-03-08T22:43:08.846 INFO:teuthology.orchestra.run.vm07.stdout: python3-rados-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:43:08.846 INFO:teuthology.orchestra.run.vm07.stdout: python3-rbd-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:43:08.846 INFO:teuthology.orchestra.run.vm07.stdout: python3-rgw-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:43:08.846 INFO:teuthology.orchestra.run.vm07.stdout: qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64
2026-03-08T22:43:08.846 INFO:teuthology.orchestra.run.vm07.stdout: rbd-fuse-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:43:08.846 INFO:teuthology.orchestra.run.vm07.stdout: rbd-nbd-2:19.2.3-678.ge911bdeb.el9.x86_64
2026-03-08T22:43:08.846 INFO:teuthology.orchestra.run.vm07.stdout: re2-1:20211101-20.el9.x86_64
2026-03-08T22:43:08.846 INFO:teuthology.orchestra.run.vm07.stdout: thrift-0.15.0-4.el9.x86_64
2026-03-08T22:43:08.846 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-08T22:43:08.846 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-08T22:43:09.066 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: librbd1
2026-03-08T22:43:09.066 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-08T22:43:09.068 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-08T22:43:09.075 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-08T22:43:09.075 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-08T22:43:09.263 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: python3-rados
2026-03-08T22:43:09.263 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-08T22:43:09.265 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-08T22:43:09.266 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-08T22:43:09.266 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-08T22:43:09.478 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: python3-rgw
2026-03-08T22:43:09.478 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-08T22:43:09.480 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-08T22:43:09.481 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-08T22:43:09.481 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-08T22:43:09.657 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: python3-cephfs
2026-03-08T22:43:09.657 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-08T22:43:09.659 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-08T22:43:09.660 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-08T22:43:09.660 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-08T22:43:09.836 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: python3-rbd
2026-03-08T22:43:09.836 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-08T22:43:09.838 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-08T22:43:09.838 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-08T22:43:09.838 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-08T22:43:10.009 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: rbd-fuse
2026-03-08T22:43:10.009 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-08T22:43:10.011 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-08T22:43:10.012 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-08T22:43:10.012 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-08T22:43:10.186 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: rbd-mirror
2026-03-08T22:43:10.187 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-08T22:43:10.188 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-08T22:43:10.189 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-08T22:43:10.189 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-08T22:43:10.361 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: rbd-nbd
2026-03-08T22:43:10.361 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-08T22:43:10.363 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-08T22:43:10.364 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-08T22:43:10.364 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-08T22:43:10.388 DEBUG:teuthology.orchestra.run.vm07:> sudo yum clean all
2026-03-08T22:43:10.522 INFO:teuthology.orchestra.run.vm07.stdout:56 files removed
2026-03-08T22:43:10.545 DEBUG:teuthology.orchestra.run.vm07:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-08T22:43:10.571 DEBUG:teuthology.orchestra.run.vm07:> sudo yum clean expire-cache
2026-03-08T22:43:10.731 INFO:teuthology.orchestra.run.vm07.stdout:Cache was expired
2026-03-08T22:43:10.731 INFO:teuthology.orchestra.run.vm07.stdout:0 files removed
2026-03-08T22:43:10.747 DEBUG:teuthology.parallel:result is None
2026-03-08T22:43:10.747 INFO:teuthology.task.install:Removing ceph sources lists on ubuntu@vm07.local
2026-03-08T22:43:10.747 DEBUG:teuthology.orchestra.run.vm07:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-08T22:43:10.775 DEBUG:teuthology.orchestra.run.vm07:> sudo mv -f /etc/yum/pluginconf.d/priorities.conf.orig /etc/yum/pluginconf.d/priorities.conf
2026-03-08T22:43:10.845 DEBUG:teuthology.parallel:result is None
2026-03-08T22:43:10.845 DEBUG:teuthology.run_tasks:Unwinding manager clock
2026-03-08T22:43:10.847 INFO:teuthology.task.clock:Checking final clock skew...
2026-03-08T22:43:10.847 DEBUG:teuthology.orchestra.run.vm07:> PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-08T22:43:10.903 INFO:teuthology.orchestra.run.vm07.stderr:bash: line 1: ntpq: command not found
2026-03-08T22:43:10.909 INFO:teuthology.orchestra.run.vm07.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample
2026-03-08T22:43:10.909 INFO:teuthology.orchestra.run.vm07.stdout:===============================================================================
2026-03-08T22:43:10.909 INFO:teuthology.orchestra.run.vm07.stdout:^+ ntp2.kernfusion.at 2 6 177 36 +1340us[+1262us] +/- 26ms
2026-03-08T22:43:10.909 INFO:teuthology.orchestra.run.vm07.stdout:^+ ntp2.wup-de.hosts.301-mo> 2 6 177 35 -245us[ -324us] +/- 21ms
2026-03-08T22:43:10.909 INFO:teuthology.orchestra.run.vm07.stdout:^+ mail.sassmann.nrw 2 6 177 99 +1129us[+1049us] +/- 46ms
2026-03-08T22:43:10.909 INFO:teuthology.orchestra.run.vm07.stdout:^* ntp3.lwlcom.net 1 6 177 35 -2091us[-2169us] +/- 16ms
2026-03-08T22:43:10.909 DEBUG:teuthology.run_tasks:Unwinding manager ansible.cephlab
2026-03-08T22:43:10.912 INFO:teuthology.task.ansible:Skipping ansible cleanup...
2026-03-08T22:43:10.913 DEBUG:teuthology.run_tasks:Unwinding manager selinux
2026-03-08T22:43:10.915 DEBUG:teuthology.run_tasks:Unwinding manager pcp
2026-03-08T22:43:10.917 DEBUG:teuthology.run_tasks:Unwinding manager internal.timer
2026-03-08T22:43:10.919 INFO:teuthology.task.internal:Duration was 329.582949 seconds
2026-03-08T22:43:10.919 DEBUG:teuthology.run_tasks:Unwinding manager internal.syslog
2026-03-08T22:43:10.921 INFO:teuthology.task.internal.syslog:Shutting down syslog monitoring...
2026-03-08T22:43:10.921 DEBUG:teuthology.orchestra.run.vm07:> sudo rm -f -- /etc/rsyslog.d/80-cephtest.conf && sudo service rsyslog restart
2026-03-08T22:43:10.996 INFO:teuthology.orchestra.run.vm07.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-08T22:43:11.193 INFO:teuthology.task.internal.syslog:Checking logs for errors...
2026-03-08T22:43:11.193 DEBUG:teuthology.task.internal.syslog:Checking ubuntu@vm07.local
2026-03-08T22:43:11.193 DEBUG:teuthology.orchestra.run.vm07:> grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' /home/ubuntu/cephtest/archive/syslog/kern.log | grep -v 'task .* blocked for more than .* seconds' | grep -v 'lockdep is turned off' | grep -v 'trying to register non-static key' | grep -v 'DEBUG: fsize' | grep -v CRON | grep -v 'BUG: bad unlock balance detected' | grep -v 'inconsistent lock state' | grep -v '*** DEADLOCK ***' | grep -v 'INFO: possible irq lock inversion dependency detected' | grep -v 'INFO: NMI handler (perf_event_nmi_handler) took too long to run' | grep -v 'INFO: recovery required on readonly' | grep -v 'ceph-create-keys: INFO' | grep -v INFO:ceph-create-keys | grep -v 'Loaded datasource DataSourceOpenStack' | grep -v 'container-storage-setup: INFO: Volume group backing root filesystem could not be determined' | grep -E -v '\bsalt-master\b|\bsalt-minion\b|\bsalt-api\b' | grep -v ceph-crash | grep -E -v '\btcmu-runner\b.*\bINFO\b' | head -n 1
2026-03-08T22:43:11.222 INFO:teuthology.task.internal.syslog:Gathering journactl...
2026-03-08T22:43:11.222 DEBUG:teuthology.orchestra.run.vm07:> sudo journalctl > /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-08T22:43:11.604 INFO:teuthology.task.internal.syslog:Compressing syslogs...
2026-03-08T22:43:11.604 DEBUG:teuthology.orchestra.run.vm07:> find /home/ubuntu/cephtest/archive/syslog -name '*.log' -print0 | sudo xargs -0 --max-args=1 --max-procs=0 --verbose --no-run-if-empty -- gzip -5 --verbose --
2026-03-08T22:43:11.633 INFO:teuthology.orchestra.run.vm07.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-08T22:43:11.633 INFO:teuthology.orchestra.run.vm07.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-08T22:43:11.633 INFO:teuthology.orchestra.run.vm07.stderr:/home/ubuntu/cephtest/archive/syslog/kern.log: gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-08T22:43:11.633 INFO:teuthology.orchestra.run.vm07.stderr: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/kern.log.gz
2026-03-08T22:43:11.634 INFO:teuthology.orchestra.run.vm07.stderr:/home/ubuntu/cephtest/archive/syslog/misc.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/misc.log.gz
2026-03-08T22:43:11.746 INFO:teuthology.orchestra.run.vm07.stderr:/home/ubuntu/cephtest/archive/syslog/journalctl.log: 98.4% -- replaced with /home/ubuntu/cephtest/archive/syslog/journalctl.log.gz
2026-03-08T22:43:11.748 DEBUG:teuthology.run_tasks:Unwinding manager internal.sudo
2026-03-08T22:43:11.751 INFO:teuthology.task.internal:Restoring /etc/sudoers...
2026-03-08T22:43:11.751 DEBUG:teuthology.orchestra.run.vm07:> sudo mv -f /etc/sudoers.orig.teuthology /etc/sudoers
2026-03-08T22:43:11.812 DEBUG:teuthology.run_tasks:Unwinding manager internal.coredump
2026-03-08T22:43:11.816 DEBUG:teuthology.orchestra.run.vm07:> sudo sysctl -w kernel.core_pattern=core && sudo bash -c 'for f in `find /home/ubuntu/cephtest/archive/coredump -type f`; do file $f | grep -q systemd-sysusers && rm $f || true ; done' && rmdir --ignore-fail-on-non-empty -- /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:43:11.875 INFO:teuthology.orchestra.run.vm07.stdout:kernel.core_pattern = core
2026-03-08T22:43:11.891 DEBUG:teuthology.orchestra.run.vm07:> test -e /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:43:11.948 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-08T22:43:11.949 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive
2026-03-08T22:43:11.951 INFO:teuthology.task.internal:Transferring archived files...
2026-03-08T22:43:11.951 DEBUG:teuthology.misc:Transferring archived files from vm07:/home/ubuntu/cephtest/archive to /archive/kyr-2026-03-08_21:49:43-rados:standalone-squid-none-default-vps/278/remote/vm07
2026-03-08T22:43:11.951 DEBUG:teuthology.orchestra.run.vm07:> sudo tar c -f - -C /home/ubuntu/cephtest/archive -- .
2026-03-08T22:43:12.049 INFO:teuthology.task.internal:Removing archive directory...
2026-03-08T22:43:12.049 DEBUG:teuthology.orchestra.run.vm07:> rm -rf -- /home/ubuntu/cephtest/archive
2026-03-08T22:43:12.105 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive_upload
2026-03-08T22:43:12.109 INFO:teuthology.task.internal:Not uploading archives.
2026-03-08T22:43:12.110 DEBUG:teuthology.run_tasks:Unwinding manager internal.base
2026-03-08T22:43:12.112 INFO:teuthology.task.internal:Tidying up after the test...
2026-03-08T22:43:12.112 DEBUG:teuthology.orchestra.run.vm07:> find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest
2026-03-08T22:43:12.166 INFO:teuthology.orchestra.run.vm07.stdout: 8532140 0 drwxr-xr-x 3 ubuntu ubuntu 19 Mar 8 22:43 /home/ubuntu/cephtest
2026-03-08T22:43:12.166 INFO:teuthology.orchestra.run.vm07.stdout: 306715 0 drwxr-xr-x 3 ubuntu ubuntu 22 Mar 8 22:40 /home/ubuntu/cephtest/mnt.0
2026-03-08T22:43:12.166 INFO:teuthology.orchestra.run.vm07.stdout: 4342429 0 drwxr-xr-x 3 ubuntu ubuntu 17 Mar 8 22:41 /home/ubuntu/cephtest/mnt.0/client.0
2026-03-08T22:43:12.166 INFO:teuthology.orchestra.run.vm07.stdout: 75656161 0 drwxr-xr-x 3 ubuntu ubuntu 16 Mar 8 22:41 /home/ubuntu/cephtest/mnt.0/client.0/tmp
2026-03-08T22:43:12.166 INFO:teuthology.orchestra.run.vm07.stdout: 79746773 0 drwxr-xr-x 2 ubuntu ubuntu 6 Mar 8 22:42 /home/ubuntu/cephtest/mnt.0/client.0/tmp/td
2026-03-08T22:43:12.167 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-08T22:43:12.167 INFO:teuthology.orchestra.run.vm07.stderr:rmdir: failed to remove '/home/ubuntu/cephtest': Directory not empty
2026-03-08T22:43:12.167 ERROR:teuthology.run_tasks:Manager failed: internal.base
Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__
    next(self.gen)
  File "/home/teuthos/teuthology/teuthology/task/internal/__init__.py", line 53, in base
    run.wait(
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 485, in wait
    proc.wait()
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 161, in wait
    self._raise_for_status()
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 181, in _raise_for_status
    raise CommandFailedError(
teuthology.exceptions.CommandFailedError: Command failed on vm07 with status 1: 'find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest'
2026-03-08T22:43:12.167 DEBUG:teuthology.run_tasks:Unwinding manager console_log
2026-03-08T22:43:12.172 DEBUG:teuthology.run_tasks:Exception was not quenched, exiting: CommandFailedError: Command failed on vm07 with status 1: 'find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest'
2026-03-08T22:43:12.173 INFO:teuthology.run:Summary data:
description: rados:standalone/{supported-random-distro$/{centos_latest} workloads/misc}
duration: 329.58294916152954
failure_reason: 'Command failed (workunit test misc/mclock-config.sh) on vm07 with status 1: ''mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=569c3e99c9b32a51b4eaf08731c728f4513ed589 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/standalone/misc/mclock-config.sh'''
flavor: default
owner: kyr
sentry_event: null
status: fail
success: false
2026-03-08T22:43:12.173 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-08T22:43:12.200 INFO:teuthology.run:FAIL