2026-03-24T16:49:22.647 INFO:root:teuthology version: 1.2.4.dev6+g1c580df7a
2026-03-24T16:49:22.652 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-24T16:49:22.670 INFO:teuthology.run:Config:
archive_path: /archive/kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps/3621
branch: tentacle
description: rbd/cli_v1/{base/install clusters/{fixed-1} conf/{disable-pool-app} features/format-1 msgr-failures/few objectstore/bluestore-comp-zlib supported-random-distro$/{ubuntu_latest} workloads/rbd_cli_generic}
email: null
first_in_suite: false
flavor: default
job_id: '3621'
ktype: distro
last_in_suite: false
machine_type: vps
name: kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps
no_nested_subset: false
os_type: ubuntu
os_version: '22.04'
overrides:
  admin_socket:
    branch: tentacle
  ansible.cephlab:
    branch: main
    repo: https://github.com/kshtsk/ceph-cm-ansible.git
    skip_tags: nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
    vars:
      logical_volumes:
        lv_1:
          scratch_dev: true
          size: 25%VG
          vg: vg_nvme
        lv_2:
          scratch_dev: true
          size: 25%VG
          vg: vg_nvme
        lv_3:
          scratch_dev: true
          size: 25%VG
          vg: vg_nvme
        lv_4:
          scratch_dev: true
          size: 25%VG
          vg: vg_nvme
      timezone: UTC
      volume_groups:
        vg_nvme:
          pvs: /dev/vdb,/dev/vdc,/dev/vdd,/dev/vde
  ceph:
    conf:
      client:
        rbd default format: 1
      global:
        mon client directed command retry: 5
        mon warn on pool no app: false
        ms inject socket failures: 5000
      mgr:
        debug mgr: 20
        debug ms: 1
      mon:
        debug mon: 20
        debug ms: 1
        debug paxos: 20
      osd:
        bluestore block size: 96636764160
        bluestore compression algorithm: zlib
        bluestore compression mode: aggressive
        bluestore fsck on mount: true
        debug bluefs: 1/20
        debug bluestore: 1/20
        debug ms: 1
        debug osd: 20
        debug rocksdb: 4/10
        mon osd backfillfull_ratio: 0.85
        mon osd full ratio: 0.9
        mon osd nearfull ratio: 0.8
        osd failsafe full ratio: 0.95
        osd mclock iops capacity threshold hdd: 49000
        osd objectstore: bluestore
        osd shutdown pgref assert: true
    flavor: default
    fs: xfs
    log-ignorelist:
    - \(MDS_ALL_DOWN\)
    - \(MDS_UP_LESS_THAN_MAX\)
    - \(OSD_SLOW_PING_TIME
    sha1: 70f8415b300f041766fa27faf7d5472699e32388
  ceph-deploy:
    conf:
      client:
        log file: /var/log/ceph/ceph-$name.$pid.log
      global:
        osd crush chooseleaf type: 0
        osd pool default pg num: 128
        osd pool default pgp num: 128
        osd pool default size: 2
      mon: {}
  cephadm:
    cephadm_binary_url: https://download.ceph.com/rpm-20.2.0/el9/noarch/cephadm
  install:
    ceph:
      flavor: default
      sha1: 70f8415b300f041766fa27faf7d5472699e32388
    extra_system_packages:
      deb:
      - python3-jmespath
      - python3-xmltodict
      - s3cmd
      rpm:
      - bzip2
      - perl-Test-Harness
      - python3-jmespath
      - python3-xmltodict
      - s3cmd
  thrashosds:
    bdev_inject_crash: 2
    bdev_inject_crash_probability: 0.5
  workunit:
    branch: tt-tentacle
    sha1: 0392f78529848ec72469e8e431875cb98d3a5fb4
owner: kyr
priority: 1000
repo: https://github.com/ceph/ceph.git
roles:
- - mon.a
  - mgr.x
  - osd.0
  - osd.1
  - osd.2
  - client.0
seed: 3051
sha1: 70f8415b300f041766fa27faf7d5472699e32388
sleep_before_teardown: 0
subset: 1/128
suite: rbd
suite_branch: tt-tentacle
suite_path: /home/teuthos/src/github.com_kshtsk_ceph_0392f78529848ec72469e8e431875cb98d3a5fb4/qa
suite_relpath: qa
suite_repo: https://github.com/kshtsk/ceph.git
suite_sha1: 0392f78529848ec72469e8e431875cb98d3a5fb4
targets:
  vm01.local: ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAEZ/pkw+OjvPpGlQZVoY9oPgppKVoxKt6C8jtlR5FHneXM4CmLpK6FmagGLmuIiAgExmNNElexPKds5DRmE6Yg=
tasks:
- install: null
- ceph: null
- workunit:
    clients:
      client.0:
      - rbd/cli_generic.sh
teuthology:
  fragments_dropped: []
  meta: {}
  postmerge: []
teuthology_branch: clyso-debian-13
teuthology_repo: https://github.com/clyso/teuthology
teuthology_sha1: 1c580df7a9c7c2aadc272da296344fd99f27c444
timestamp: 2026-03-20_22:04:26
tube: vps
user: kyr
verbose: false
worker_log: /home/teuthos/.teuthology/dispatcher/dispatcher.vps.2366871
2026-03-24T16:49:22.670 INFO:teuthology.run:suite_path is set to /home/teuthos/src/github.com_kshtsk_ceph_0392f78529848ec72469e8e431875cb98d3a5fb4/qa; will attempt to use it
2026-03-24T16:49:22.670 INFO:teuthology.run:Found tasks at /home/teuthos/src/github.com_kshtsk_ceph_0392f78529848ec72469e8e431875cb98d3a5fb4/qa/tasks
2026-03-24T16:49:22.670 INFO:teuthology.run_tasks:Running task internal.check_packages...
2026-03-24T16:49:22.671 INFO:teuthology.task.internal:Checking packages...
2026-03-24T16:49:22.671 INFO:teuthology.task.internal:Checking packages for os_type 'ubuntu', flavor 'default' and ceph hash '70f8415b300f041766fa27faf7d5472699e32388'
2026-03-24T16:49:22.671 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch
2026-03-24T16:49:22.671 INFO:teuthology.packaging:ref: None
2026-03-24T16:49:22.671 INFO:teuthology.packaging:tag: None
2026-03-24T16:49:22.671 INFO:teuthology.packaging:branch: tentacle
2026-03-24T16:49:22.671 INFO:teuthology.packaging:sha1: 70f8415b300f041766fa27faf7d5472699e32388
2026-03-24T16:49:22.671 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=ubuntu%2F22.04%2Fx86_64&ref=tentacle
2026-03-24T16:49:23.375 INFO:teuthology.task.internal:Found packages for ceph version 20.2.0-714-g147f7c6a-1jammy
2026-03-24T16:49:23.376 INFO:teuthology.run_tasks:Running task internal.buildpackages_prep...
2026-03-24T16:49:23.376 INFO:teuthology.task.internal:no buildpackages task found
2026-03-24T16:49:23.376 INFO:teuthology.run_tasks:Running task internal.save_config...
2026-03-24T16:49:23.377 INFO:teuthology.task.internal:Saving configuration
2026-03-24T16:49:23.382 INFO:teuthology.run_tasks:Running task internal.check_lock...
2026-03-24T16:49:23.383 INFO:teuthology.task.internal.check_lock:Checking locks...
2026-03-24T16:49:23.392 DEBUG:teuthology.task.internal.check_lock:machine status is {'name': 'vm01.local', 'description': '/archive/kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps/3621', 'up': True, 'machine_type': 'vps', 'is_vm': True, 'vm_host': {'name': 'localhost', 'description': None, 'up': True, 'machine_type': 'libvirt', 'is_vm': False, 'vm_host': None, 'os_type': None, 'os_version': None, 'arch': None, 'locked': True, 'locked_since': None, 'locked_by': None, 'mac_address': None, 'ssh_pub_key': None}, 'os_type': 'ubuntu', 'os_version': '22.04', 'arch': 'x86_64', 'locked': True, 'locked_since': '2026-03-24 16:48:39.510459', 'locked_by': 'kyr', 'mac_address': '52:55:00:00:00:01', 'ssh_pub_key': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAEZ/pkw+OjvPpGlQZVoY9oPgppKVoxKt6C8jtlR5FHneXM4CmLpK6FmagGLmuIiAgExmNNElexPKds5DRmE6Yg='}
2026-03-24T16:49:23.392 INFO:teuthology.run_tasks:Running task internal.add_remotes...
2026-03-24T16:49:23.393 INFO:teuthology.task.internal:roles: ubuntu@vm01.local - ['mon.a', 'mgr.x', 'osd.0', 'osd.1', 'osd.2', 'client.0']
2026-03-24T16:49:23.393 INFO:teuthology.run_tasks:Running task console_log...
2026-03-24T16:49:23.401 DEBUG:teuthology.task.console_log:vm01 does not support IPMI; excluding
2026-03-24T16:49:23.402 DEBUG:teuthology.exit:Installing handler: Handler(exiter=, func=.kill_console_loggers at 0x7fe5045ccf70>, signals=[15])
2026-03-24T16:49:23.402 INFO:teuthology.run_tasks:Running task internal.connect...
2026-03-24T16:49:23.403 INFO:teuthology.task.internal:Opening connections...
2026-03-24T16:49:23.403 DEBUG:teuthology.task.internal:connecting to ubuntu@vm01.local
2026-03-24T16:49:23.404 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm01.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-24T16:49:23.467 INFO:teuthology.run_tasks:Running task internal.push_inventory...
2026-03-24T16:49:23.469 DEBUG:teuthology.orchestra.run.vm01:> uname -m
2026-03-24T16:49:23.606 INFO:teuthology.orchestra.run.vm01.stdout:x86_64
2026-03-24T16:49:23.606 DEBUG:teuthology.orchestra.run.vm01:> cat /etc/os-release
2026-03-24T16:49:23.651 INFO:teuthology.orchestra.run.vm01.stdout:PRETTY_NAME="Ubuntu 22.04.5 LTS"
2026-03-24T16:49:23.651 INFO:teuthology.orchestra.run.vm01.stdout:NAME="Ubuntu"
2026-03-24T16:49:23.651 INFO:teuthology.orchestra.run.vm01.stdout:VERSION_ID="22.04"
2026-03-24T16:49:23.651 INFO:teuthology.orchestra.run.vm01.stdout:VERSION="22.04.5 LTS (Jammy Jellyfish)"
2026-03-24T16:49:23.651 INFO:teuthology.orchestra.run.vm01.stdout:VERSION_CODENAME=jammy
2026-03-24T16:49:23.651 INFO:teuthology.orchestra.run.vm01.stdout:ID=ubuntu
2026-03-24T16:49:23.651 INFO:teuthology.orchestra.run.vm01.stdout:ID_LIKE=debian
2026-03-24T16:49:23.651 INFO:teuthology.orchestra.run.vm01.stdout:HOME_URL="https://www.ubuntu.com/"
2026-03-24T16:49:23.651 INFO:teuthology.orchestra.run.vm01.stdout:SUPPORT_URL="https://help.ubuntu.com/"
2026-03-24T16:49:23.651 INFO:teuthology.orchestra.run.vm01.stdout:BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
2026-03-24T16:49:23.651 INFO:teuthology.orchestra.run.vm01.stdout:PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
2026-03-24T16:49:23.651 INFO:teuthology.orchestra.run.vm01.stdout:UBUNTU_CODENAME=jammy
2026-03-24T16:49:23.651 INFO:teuthology.lock.ops:Updating vm01.local on lock server
2026-03-24T16:49:23.656 INFO:teuthology.run_tasks:Running task internal.serialize_remote_roles...
2026-03-24T16:49:23.659 INFO:teuthology.run_tasks:Running task internal.check_conflict...
2026-03-24T16:49:23.660 INFO:teuthology.task.internal:Checking for old test directory...
2026-03-24T16:49:23.660 DEBUG:teuthology.orchestra.run.vm01:> test '!' -e /home/ubuntu/cephtest
2026-03-24T16:49:23.694 INFO:teuthology.run_tasks:Running task internal.check_ceph_data...
2026-03-24T16:49:23.705 INFO:teuthology.task.internal:Checking for non-empty /var/lib/ceph...
2026-03-24T16:49:23.705 DEBUG:teuthology.orchestra.run.vm01:> test -z $(ls -A /var/lib/ceph)
2026-03-24T16:49:23.739 INFO:teuthology.orchestra.run.vm01.stderr:ls: cannot access '/var/lib/ceph': No such file or directory
2026-03-24T16:49:23.739 INFO:teuthology.run_tasks:Running task internal.vm_setup...
2026-03-24T16:49:23.748 DEBUG:teuthology.orchestra.run.vm01:> test -e /ceph-qa-ready
2026-03-24T16:49:23.786 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-24T16:49:24.127 INFO:teuthology.run_tasks:Running task internal.base...
2026-03-24T16:49:24.129 INFO:teuthology.task.internal:Creating test directory...
2026-03-24T16:49:24.129 DEBUG:teuthology.orchestra.run.vm01:> mkdir -p -m0755 -- /home/ubuntu/cephtest
2026-03-24T16:49:24.133 INFO:teuthology.run_tasks:Running task internal.archive_upload...
2026-03-24T16:49:24.134 INFO:teuthology.run_tasks:Running task internal.archive...
2026-03-24T16:49:24.136 INFO:teuthology.task.internal:Creating archive directory...
2026-03-24T16:49:24.136 DEBUG:teuthology.orchestra.run.vm01:> install -d -m0755 -- /home/ubuntu/cephtest/archive
2026-03-24T16:49:24.181 INFO:teuthology.run_tasks:Running task internal.coredump...
2026-03-24T16:49:24.182 INFO:teuthology.task.internal:Enabling coredump saving...
2026-03-24T16:49:24.182 DEBUG:teuthology.orchestra.run.vm01:> test -f /run/.containerenv -o -f /.dockerenv
2026-03-24T16:49:24.227 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-24T16:49:24.227 DEBUG:teuthology.orchestra.run.vm01:> install -d -m0755 -- /home/ubuntu/cephtest/archive/coredump && sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core && echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf
2026-03-24T16:49:24.278 INFO:teuthology.orchestra.run.vm01.stdout:kernel.core_pattern = /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-24T16:49:24.283 INFO:teuthology.orchestra.run.vm01.stdout:kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-24T16:49:24.284 INFO:teuthology.run_tasks:Running task internal.sudo...
2026-03-24T16:49:24.286 INFO:teuthology.task.internal:Configuring sudo...
2026-03-24T16:49:24.286 DEBUG:teuthology.orchestra.run.vm01:> sudo sed -i.orig.teuthology -e 's/^\([^#]*\) \(requiretty\)/\1 !\2/g' -e 's/^\([^#]*\) !\(visiblepw\)/\1 \2/g' /etc/sudoers
2026-03-24T16:49:24.337 INFO:teuthology.run_tasks:Running task internal.syslog...
2026-03-24T16:49:24.340 INFO:teuthology.task.internal.syslog:Starting syslog monitoring...
2026-03-24T16:49:24.341 DEBUG:teuthology.orchestra.run.vm01:> mkdir -p -m0755 -- /home/ubuntu/cephtest/archive/syslog
2026-03-24T16:49:24.383 DEBUG:teuthology.orchestra.run.vm01:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-24T16:49:24.428 DEBUG:teuthology.orchestra.run.vm01:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-24T16:49:24.475 DEBUG:teuthology.orchestra.run.vm01:> set -ex
2026-03-24T16:49:24.475 DEBUG:teuthology.orchestra.run.vm01:> sudo dd of=/etc/rsyslog.d/80-cephtest.conf
2026-03-24T16:49:24.524 DEBUG:teuthology.orchestra.run.vm01:> sudo service rsyslog restart
2026-03-24T16:49:24.586 INFO:teuthology.run_tasks:Running task internal.timer...
2026-03-24T16:49:24.587 INFO:teuthology.task.internal:Starting timer...
2026-03-24T16:49:24.588 INFO:teuthology.run_tasks:Running task pcp...
2026-03-24T16:49:24.591 INFO:teuthology.run_tasks:Running task selinux...
2026-03-24T16:49:24.593 INFO:teuthology.task.selinux:Excluding vm01: VMs are not yet supported
2026-03-24T16:49:24.593 DEBUG:teuthology.task.selinux:Getting current SELinux state
2026-03-24T16:49:24.593 DEBUG:teuthology.task.selinux:Existing SELinux modes: {}
2026-03-24T16:49:24.593 INFO:teuthology.task.selinux:Putting SELinux into permissive mode
2026-03-24T16:49:24.593 INFO:teuthology.run_tasks:Running task ansible.cephlab...
2026-03-24T16:49:24.595 DEBUG:teuthology.task:Applying overrides for task ansible.cephlab: {'branch': 'main', 'repo': 'https://github.com/kshtsk/ceph-cm-ansible.git', 'skip_tags': 'nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs', 'vars': {'logical_volumes': {'lv_1': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_2': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_3': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_4': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}}, 'timezone': 'UTC', 'volume_groups': {'vg_nvme': {'pvs': '/dev/vdb,/dev/vdc,/dev/vdd,/dev/vde'}}}}
2026-03-24T16:49:24.595 DEBUG:teuthology.repo_utils:Setting repo remote to https://github.com/kshtsk/ceph-cm-ansible.git
2026-03-24T16:49:24.597 INFO:teuthology.repo_utils:Fetching github.com_kshtsk_ceph-cm-ansible_main from origin
2026-03-24T16:49:25.091 DEBUG:teuthology.repo_utils:Resetting repo at /home/teuthos/src/github.com_kshtsk_ceph-cm-ansible_main to origin/main
2026-03-24T16:49:25.097 INFO:teuthology.task.ansible:Playbook: [{'import_playbook': 'ansible_managed.yml'}, {'import_playbook': 'teuthology.yml'}, {'hosts': 'testnodes', 'tasks': [{'set_fact': {'ran_from_cephlab_playbook': True}}]}, {'import_playbook': 'testnodes.yml'}, {'import_playbook': 'container-host.yml'}, {'import_playbook': 'cobbler.yml'}, {'import_playbook': 'paddles.yml'}, {'import_playbook': 'pulpito.yml'}, {'hosts': 'testnodes', 'become': True, 'tasks': [{'name': 'Touch /ceph-qa-ready', 'file': {'path': '/ceph-qa-ready', 'state': 'touch'}, 'when': 'ran_from_cephlab_playbook|bool'}]}]
2026-03-24T16:49:25.097 DEBUG:teuthology.task.ansible:Running ansible-playbook -v --extra-vars '{"ansible_ssh_user": "ubuntu", "logical_volumes": {"lv_1": {"scratch_dev": true, "size": "25%VG", "vg": "vg_nvme"}, "lv_2": {"scratch_dev": true, "size": "25%VG", "vg": "vg_nvme"}, "lv_3": {"scratch_dev": true, "size": "25%VG", "vg": "vg_nvme"}, "lv_4": {"scratch_dev": true, "size": "25%VG", "vg": "vg_nvme"}}, "timezone": "UTC", "volume_groups": {"vg_nvme": {"pvs": "/dev/vdb,/dev/vdc,/dev/vdd,/dev/vde"}}}' -i /tmp/teuth_ansible_inventoryk9ltoivw --limit vm01.local /home/teuthos/src/github.com_kshtsk_ceph-cm-ansible_main/cephlab.yml --skip-tags nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
2026-03-24T16:51:43.680 DEBUG:teuthology.task.ansible:Reconnecting to [Remote(name='ubuntu@vm01.local')]
2026-03-24T16:51:43.681 INFO:teuthology.orchestra.remote:Trying to reconnect to host 'ubuntu@vm01.local'
2026-03-24T16:51:43.681 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm01.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-24T16:51:43.743 DEBUG:teuthology.orchestra.run.vm01:> true
2026-03-24T16:51:43.989 INFO:teuthology.orchestra.remote:Successfully reconnected to host 'ubuntu@vm01.local'
2026-03-24T16:51:43.989 INFO:teuthology.run_tasks:Running task clock...
2026-03-24T16:51:43.992 INFO:teuthology.task.clock:Syncing clocks and checking initial clock skew...
2026-03-24T16:51:43.992 INFO:teuthology.orchestra.run:Running command with timeout 360
2026-03-24T16:51:43.992 DEBUG:teuthology.orchestra.run.vm01:> sudo systemctl stop ntp.service || sudo systemctl stop ntpd.service || sudo systemctl stop chronyd.service ; sudo ntpd -gq || sudo chronyc makestep ; sudo systemctl start ntp.service || sudo systemctl start ntpd.service || sudo systemctl start chronyd.service ; PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-24T16:51:44.050 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:44 ntpd[16231]: ntpd 4.2.8p15@1.3728-o Wed Feb 16 17:13:02 UTC 2022 (1): Starting
2026-03-24T16:51:44.050 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:44 ntpd[16231]: Command line: ntpd -gq
2026-03-24T16:51:44.050 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:44 ntpd[16231]: ----------------------------------------------------
2026-03-24T16:51:44.050 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:44 ntpd[16231]: ntp-4 is maintained by Network Time Foundation,
2026-03-24T16:51:44.050 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:44 ntpd[16231]: Inc. (NTF), a non-profit 501(c)(3) public-benefit
2026-03-24T16:51:44.050 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:44 ntpd[16231]: corporation. Support and training for ntp-4 are
2026-03-24T16:51:44.050 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:44 ntpd[16231]: available at https://www.nwtime.org/support
2026-03-24T16:51:44.050 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:44 ntpd[16231]: ----------------------------------------------------
2026-03-24T16:51:44.050 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:44 ntpd[16231]: proto: precision = 0.029 usec (-25)
2026-03-24T16:51:44.051 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:44 ntpd[16231]: basedate set to 2022-02-04
2026-03-24T16:51:44.051 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:44 ntpd[16231]: gps base set to 2022-02-06 (week 2196)
2026-03-24T16:51:44.051 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:44 ntpd[16231]: leapsecond file ('/usr/share/zoneinfo/leap-seconds.list'): good hash signature
2026-03-24T16:51:44.051 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:44 ntpd[16231]: leapsecond file ('/usr/share/zoneinfo/leap-seconds.list'): loaded, expire=2025-12-28T00:00:00Z last=2017-01-01T00:00:00Z ofs=37
2026-03-24T16:51:44.051 INFO:teuthology.orchestra.run.vm01.stderr:24 Mar 16:51:44 ntpd[16231]: leapsecond file ('/usr/share/zoneinfo/leap-seconds.list'): expired 87 days ago
2026-03-24T16:51:44.052 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:44 ntpd[16231]: Listen and drop on 0 v6wildcard [::]:123
2026-03-24T16:51:44.052 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:44 ntpd[16231]: Listen and drop on 1 v4wildcard 0.0.0.0:123
2026-03-24T16:51:44.052 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:44 ntpd[16231]: Listen normally on 2 lo 127.0.0.1:123
2026-03-24T16:51:44.052 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:44 ntpd[16231]: Listen normally on 3 ens3 192.168.123.101:123
2026-03-24T16:51:44.053 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:44 ntpd[16231]: Listen normally on 4 lo [::1]:123
2026-03-24T16:51:44.053 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:44 ntpd[16231]: Listen normally on 5 ens3 [fe80::5055:ff:fe00:1%2]:123
2026-03-24T16:51:44.053 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:44 ntpd[16231]: Listening on routing socket on fd #22 for interface updates
2026-03-24T16:51:45.052 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:45 ntpd[16231]: Soliciting pool server 144.91.126.59
2026-03-24T16:51:46.051 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:46 ntpd[16231]: Soliciting pool server 176.9.44.212
2026-03-24T16:51:46.051 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:46 ntpd[16231]: Soliciting pool server 162.19.170.154
2026-03-24T16:51:47.051 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:47 ntpd[16231]: Soliciting pool server 85.215.122.93
2026-03-24T16:51:47.051 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:47 ntpd[16231]: Soliciting pool server 194.36.144.87
2026-03-24T16:51:47.051 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:47 ntpd[16231]: Soliciting pool server 139.162.152.20
2026-03-24T16:51:48.051 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:48 ntpd[16231]: Soliciting pool server 148.251.5.46
2026-03-24T16:51:48.051 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:48 ntpd[16231]: Soliciting pool server 46.38.233.159
2026-03-24T16:51:48.051 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:48 ntpd[16231]: Soliciting pool server 158.180.28.150
2026-03-24T16:51:48.051 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:48 ntpd[16231]: Soliciting pool server 185.16.60.96
2026-03-24T16:51:49.051 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:49 ntpd[16231]: Soliciting pool server 162.159.200.123
2026-03-24T16:51:49.051 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:49 ntpd[16231]: Soliciting pool server 37.221.199.157
2026-03-24T16:51:49.051 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:49 ntpd[16231]: Soliciting pool server 185.125.190.56
2026-03-24T16:51:50.051 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:50 ntpd[16231]: Soliciting pool server 185.125.190.57
2026-03-24T16:51:50.051 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:50 ntpd[16231]: Soliciting pool server 77.37.65.181
2026-03-24T16:51:50.051 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:50 ntpd[16231]: Soliciting pool server 176.9.157.155
2026-03-24T16:51:52.083 INFO:teuthology.orchestra.run.vm01.stdout:24 Mar 16:51:52 ntpd[16231]: ntpd: time slew +0.005281 s
2026-03-24T16:51:52.084 INFO:teuthology.orchestra.run.vm01.stdout:ntpd: time slew +0.005281s
2026-03-24T16:51:52.107 INFO:teuthology.orchestra.run.vm01.stdout: remote refid st t when poll reach delay offset jitter
2026-03-24T16:51:52.107 INFO:teuthology.orchestra.run.vm01.stdout:==============================================================================
2026-03-24T16:51:52.107 INFO:teuthology.orchestra.run.vm01.stdout: 0.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-24T16:51:52.107 INFO:teuthology.orchestra.run.vm01.stdout: 1.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-24T16:51:52.107 INFO:teuthology.orchestra.run.vm01.stdout: 2.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-24T16:51:52.107 INFO:teuthology.orchestra.run.vm01.stdout: 3.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-24T16:51:52.107 INFO:teuthology.orchestra.run.vm01.stdout: ntp.ubuntu.com .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-24T16:51:52.107 INFO:teuthology.run_tasks:Running task install...
2026-03-24T16:51:52.148 DEBUG:teuthology.task.install:project ceph
2026-03-24T16:51:52.148 DEBUG:teuthology.task.install:INSTALL overrides: {'ceph': {'flavor': 'default', 'sha1': '70f8415b300f041766fa27faf7d5472699e32388'}, 'extra_system_packages': {'deb': ['python3-jmespath', 'python3-xmltodict', 's3cmd'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-jmespath', 'python3-xmltodict', 's3cmd']}}
2026-03-24T16:51:52.148 DEBUG:teuthology.task.install:config {'flavor': 'default', 'sha1': '70f8415b300f041766fa27faf7d5472699e32388', 'extra_system_packages': {'deb': ['python3-jmespath', 'python3-xmltodict', 's3cmd'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-jmespath', 'python3-xmltodict', 's3cmd']}}
2026-03-24T16:51:52.148 INFO:teuthology.task.install:Using flavor: default
2026-03-24T16:51:52.151 DEBUG:teuthology.task.install:Package list is: {'deb': ['ceph', 'cephadm', 'ceph-mds', 'ceph-mgr', 'ceph-common', 'ceph-fuse', 'ceph-test', 'ceph-volume', 'radosgw', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'libcephfs2', 'libcephfs-dev', 'librados2', 'librbd1', 'rbd-fuse'], 'rpm': ['ceph-radosgw', 'ceph-test', 'ceph', 'ceph-base', 'cephadm', 'ceph-immutable-object-cache', 'ceph-mgr', 'ceph-mgr-dashboard', 'ceph-mgr-diskprediction-local', 'ceph-mgr-rook', 'ceph-mgr-cephadm', 'ceph-fuse', 'ceph-volume', 'librados-devel', 'libcephfs2', 'libcephfs-devel', 'librados2', 'librbd1', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'rbd-fuse', 'rbd-mirror', 'rbd-nbd']}
2026-03-24T16:51:52.151 INFO:teuthology.task.install:extra packages: []
2026-03-24T16:51:52.152 DEBUG:teuthology.orchestra.run.vm01:> sudo apt-key list | grep Ceph
2026-03-24T16:51:52.195 INFO:teuthology.orchestra.run.vm01.stderr:Warning: apt-key is deprecated. Manage keyring files in trusted.gpg.d instead (see apt-key(8)).
2026-03-24T16:51:52.216 INFO:teuthology.orchestra.run.vm01.stdout:uid [ unknown] Ceph automated package build (Ceph automated package build)
2026-03-24T16:51:52.216 INFO:teuthology.orchestra.run.vm01.stdout:uid [ unknown] Ceph.com (release key)
2026-03-24T16:51:52.216 INFO:teuthology.task.install.deb:Installing packages: ceph, cephadm, ceph-mds, ceph-mgr, ceph-common, ceph-fuse, ceph-test, ceph-volume, radosgw, python3-rados, python3-rgw, python3-cephfs, python3-rbd, libcephfs2, libcephfs-dev, librados2, librbd1, rbd-fuse on remote deb x86_64
2026-03-24T16:51:52.216 INFO:teuthology.task.install.deb:Installing system (non-project) packages: python3-jmespath, python3-xmltodict, s3cmd on remote deb x86_64
2026-03-24T16:51:52.216 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=ubuntu%2F22.04%2Fx86_64&sha1=70f8415b300f041766fa27faf7d5472699e32388
2026-03-24T16:51:52.790 INFO:teuthology.task.install.deb:Pulling from https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default/
2026-03-24T16:51:52.791 INFO:teuthology.task.install.deb:Package version is 20.2.0-712-g70f8415b-1jammy
2026-03-24T16:51:53.331 DEBUG:teuthology.orchestra.run.vm01:> set -ex
2026-03-24T16:51:53.331 DEBUG:teuthology.orchestra.run.vm01:> sudo dd of=/etc/apt/sources.list.d/ceph.list
2026-03-24T16:51:53.339 DEBUG:teuthology.orchestra.run.vm01:> sudo apt-get update
2026-03-24T16:51:53.459 INFO:teuthology.orchestra.run.vm01.stdout:Hit:1 http://security.ubuntu.com/ubuntu jammy-security InRelease
2026-03-24T16:51:53.460 INFO:teuthology.orchestra.run.vm01.stdout:Hit:2 http://archive.ubuntu.com/ubuntu jammy InRelease
2026-03-24T16:51:53.463 INFO:teuthology.orchestra.run.vm01.stdout:Hit:3 http://archive.ubuntu.com/ubuntu jammy-updates InRelease
2026-03-24T16:51:53.471 INFO:teuthology.orchestra.run.vm01.stdout:Hit:4 http://archive.ubuntu.com/ubuntu jammy-backports InRelease
2026-03-24T16:51:54.023 INFO:teuthology.orchestra.run.vm01.stdout:Ign:5 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy InRelease
2026-03-24T16:51:54.138 INFO:teuthology.orchestra.run.vm01.stdout:Get:6 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy Release [7680 B]
2026-03-24T16:51:54.254 INFO:teuthology.orchestra.run.vm01.stdout:Ign:7 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy Release.gpg
2026-03-24T16:51:54.370 INFO:teuthology.orchestra.run.vm01.stdout:Get:8 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 Packages [18.8 kB]
2026-03-24T16:51:54.576 INFO:teuthology.orchestra.run.vm01.stdout:Fetched 26.5 kB in 1s (24.9 kB/s)
2026-03-24T16:51:55.327 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-24T16:51:55.340 DEBUG:teuthology.orchestra.run.vm01:> sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" install ceph=20.2.0-712-g70f8415b-1jammy cephadm=20.2.0-712-g70f8415b-1jammy ceph-mds=20.2.0-712-g70f8415b-1jammy ceph-mgr=20.2.0-712-g70f8415b-1jammy ceph-common=20.2.0-712-g70f8415b-1jammy ceph-fuse=20.2.0-712-g70f8415b-1jammy ceph-test=20.2.0-712-g70f8415b-1jammy ceph-volume=20.2.0-712-g70f8415b-1jammy radosgw=20.2.0-712-g70f8415b-1jammy python3-rados=20.2.0-712-g70f8415b-1jammy python3-rgw=20.2.0-712-g70f8415b-1jammy python3-cephfs=20.2.0-712-g70f8415b-1jammy python3-rbd=20.2.0-712-g70f8415b-1jammy libcephfs2=20.2.0-712-g70f8415b-1jammy libcephfs-dev=20.2.0-712-g70f8415b-1jammy librados2=20.2.0-712-g70f8415b-1jammy librbd1=20.2.0-712-g70f8415b-1jammy rbd-fuse=20.2.0-712-g70f8415b-1jammy
2026-03-24T16:51:55.379 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-24T16:51:55.582 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-24T16:51:55.582 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-24T16:51:55.769 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T16:51:55.769 INFO:teuthology.orchestra.run.vm01.stdout: kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-24T16:51:55.769 INFO:teuthology.orchestra.run.vm01.stdout: libsgutils2-2 sg3-utils sg3-utils-udev
2026-03-24T16:51:55.769 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T16:51:55.769 INFO:teuthology.orchestra.run.vm01.stdout:The following additional packages will be installed:
2026-03-24T16:51:55.769 INFO:teuthology.orchestra.run.vm01.stdout: ceph-base ceph-mgr-cephadm ceph-mgr-dashboard ceph-mgr-diskprediction-local
2026-03-24T16:51:55.769 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-k8sevents ceph-mgr-modules-core ceph-mon ceph-osd jq
2026-03-24T16:51:55.770 INFO:teuthology.orchestra.run.vm01.stdout: libcephfs-daemon libcephfs-proxy2 libdouble-conversion3 libfuse2 libjq1
2026-03-24T16:51:55.770 INFO:teuthology.orchestra.run.vm01.stdout: liblttng-ust1 libnbd0 liboath0 libonig5 libpcre2-16-0 libqt5core5a
2026-03-24T16:51:55.770 INFO:teuthology.orchestra.run.vm01.stdout: libqt5dbus5 libqt5network5 libradosstriper1 librdkafka1 librgw2
2026-03-24T16:51:55.770 INFO:teuthology.orchestra.run.vm01.stdout: libsqlite3-mod-ceph libthrift-0.16.0 nvme-cli python-asyncssh-doc
2026-03-24T16:51:55.770 INFO:teuthology.orchestra.run.vm01.stdout: python3-asyncssh python3-cachetools python3-ceph-argparse
2026-03-24T16:51:55.770 INFO:teuthology.orchestra.run.vm01.stdout: python3-ceph-common python3-cheroot python3-cherrypy3 python3-google-auth
2026-03-24T16:51:55.770 INFO:teuthology.orchestra.run.vm01.stdout: python3-iniconfig python3-jaraco.classes python3-jaraco.collections
2026-03-24T16:51:55.770 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-24T16:51:55.770 INFO:teuthology.orchestra.run.vm01.stdout: python3-kubernetes python3-natsort python3-pluggy python3-portend
2026-03-24T16:51:55.770 INFO:teuthology.orchestra.run.vm01.stdout: python3-prettytable python3-psutil python3-py python3-pygments
2026-03-24T16:51:55.770 INFO:teuthology.orchestra.run.vm01.stdout: python3-pytest python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-24T16:51:55.770 INFO:teuthology.orchestra.run.vm01.stdout: python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-24T16:51:55.770 INFO:teuthology.orchestra.run.vm01.stdout: python3-tempora python3-threadpoolctl python3-toml python3-wcwidth
2026-03-24T16:51:55.770 INFO:teuthology.orchestra.run.vm01.stdout: python3-webob python3-websocket python3-zc.lockfile qttranslations5-l10n
2026-03-24T16:51:55.771 INFO:teuthology.orchestra.run.vm01.stdout: smartmontools socat xmlstarlet
2026-03-24T16:51:55.771 INFO:teuthology.orchestra.run.vm01.stdout:Suggested packages:
2026-03-24T16:51:55.771 INFO:teuthology.orchestra.run.vm01.stdout: python3-influxdb liblua5.3-dev luarocks python-natsort-doc python-psutil-doc
2026-03-24T16:51:55.771 INFO:teuthology.orchestra.run.vm01.stdout: subversion python-pygments-doc ttf-bitstream-vera python3-paste python3-dap
2026-03-24T16:51:55.771 INFO:teuthology.orchestra.run.vm01.stdout: python-sklearn-doc ipython3 python-webob-doc gsmartcontrol smart-notifier
2026-03-24T16:51:55.771 INFO:teuthology.orchestra.run.vm01.stdout: mailx | mailutils
2026-03-24T16:51:55.771 INFO:teuthology.orchestra.run.vm01.stdout:Recommended packages:
2026-03-24T16:51:55.771 INFO:teuthology.orchestra.run.vm01.stdout: btrfs-tools
2026-03-24T16:51:55.814 INFO:teuthology.orchestra.run.vm01.stdout:The following NEW packages will be installed:
2026-03-24T16:51:55.815 INFO:teuthology.orchestra.run.vm01.stdout: ceph ceph-base ceph-common ceph-fuse ceph-mds ceph-mgr ceph-mgr-cephadm
2026-03-24T16:51:55.815 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-k8sevents
2026-03-24T16:51:55.815 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core ceph-mon ceph-osd ceph-test ceph-volume cephadm jq
2026-03-24T16:51:55.815 INFO:teuthology.orchestra.run.vm01.stdout: libcephfs-daemon libcephfs-dev libcephfs-proxy2 libcephfs2
2026-03-24T16:51:55.815 INFO:teuthology.orchestra.run.vm01.stdout: libdouble-conversion3 libfuse2 libjq1 liblttng-ust1 libnbd0 liboath0
2026-03-24T16:51:55.815 INFO:teuthology.orchestra.run.vm01.stdout: libonig5 libpcre2-16-0 libqt5core5a libqt5dbus5 libqt5network5
2026-03-24T16:51:55.816 INFO:teuthology.orchestra.run.vm01.stdout: libradosstriper1 librdkafka1 librgw2 libsqlite3-mod-ceph libthrift-0.16.0
2026-03-24T16:51:55.816 INFO:teuthology.orchestra.run.vm01.stdout: nvme-cli python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-24T16:51:55.816 INFO:teuthology.orchestra.run.vm01.stdout: python3-ceph-argparse python3-ceph-common python3-cephfs python3-cheroot
2026-03-24T16:51:55.816 INFO:teuthology.orchestra.run.vm01.stdout: python3-cherrypy3 python3-google-auth python3-iniconfig
2026-03-24T16:51:55.816 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools
2026-03-24T16:51:55.816 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.text python3-joblib python3-kubernetes python3-natsort
2026-03-24T16:51:55.816 INFO:teuthology.orchestra.run.vm01.stdout: python3-pluggy python3-portend python3-prettytable python3-psutil python3-py
2026-03-24T16:51:55.816 INFO:teuthology.orchestra.run.vm01.stdout: python3-pygments python3-pytest python3-rados python3-rbd python3-repoze.lru
2026-03-24T16:51:55.816 INFO:teuthology.orchestra.run.vm01.stdout: python3-requests-oauthlib python3-rgw python3-routes python3-rsa
2026-03-24T16:51:55.816 INFO:teuthology.orchestra.run.vm01.stdout: python3-simplejson python3-sklearn python3-sklearn-lib python3-tempora
2026-03-24T16:51:55.816 INFO:teuthology.orchestra.run.vm01.stdout: python3-threadpoolctl python3-toml python3-wcwidth python3-webob
2026-03-24T16:51:55.816 INFO:teuthology.orchestra.run.vm01.stdout: python3-websocket python3-zc.lockfile qttranslations5-l10n radosgw rbd-fuse
2026-03-24T16:51:55.816 INFO:teuthology.orchestra.run.vm01.stdout: smartmontools socat xmlstarlet
2026-03-24T16:51:55.817 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be upgraded:
2026-03-24T16:51:55.817 INFO:teuthology.orchestra.run.vm01.stdout: librados2 librbd1
2026-03-24T16:51:56.041 INFO:teuthology.orchestra.run.vm01.stdout:2 upgraded, 85 newly installed, 0 to remove and 44 not upgraded.
2026-03-24T16:51:56.041 INFO:teuthology.orchestra.run.vm01.stdout:Need to get 281 MB of archives.
2026-03-24T16:51:56.041 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 1092 MB of additional disk space will be used.
2026-03-24T16:51:56.041 INFO:teuthology.orchestra.run.vm01.stdout:Get:1 http://archive.ubuntu.com/ubuntu jammy/main amd64 liblttng-ust1 amd64 2.13.1-1ubuntu1 [190 kB]
2026-03-24T16:51:56.437 INFO:teuthology.orchestra.run.vm01.stdout:Get:2 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 librbd1 amd64 20.2.0-712-g70f8415b-1jammy [2867 kB]
2026-03-24T16:51:56.563 INFO:teuthology.orchestra.run.vm01.stdout:Get:3 http://archive.ubuntu.com/ubuntu jammy/universe amd64 libdouble-conversion3 amd64 3.1.7-4 [39.0 kB]
2026-03-24T16:51:56.578 INFO:teuthology.orchestra.run.vm01.stdout:Get:4 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 libpcre2-16-0 amd64 10.39-3ubuntu0.1 [203 kB]
2026-03-24T16:51:56.679 INFO:teuthology.orchestra.run.vm01.stdout:Get:5 http://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 libqt5core5a amd64 5.15.3+dfsg-2ubuntu0.2 [2006 kB]
2026-03-24T16:51:57.072 INFO:teuthology.orchestra.run.vm01.stdout:Get:6 http://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 libqt5dbus5 amd64 5.15.3+dfsg-2ubuntu0.2 [222 kB]
2026-03-24T16:51:57.096 INFO:teuthology.orchestra.run.vm01.stdout:Get:7 http://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 libqt5network5 amd64 5.15.3+dfsg-2ubuntu0.2 [731 kB]
2026-03-24T16:51:57.160 INFO:teuthology.orchestra.run.vm01.stdout:Get:8 http://archive.ubuntu.com/ubuntu jammy/universe amd64 libthrift-0.16.0 amd64 0.16.0-2 [267 kB]
2026-03-24T16:51:57.183 INFO:teuthology.orchestra.run.vm01.stdout:Get:9 http://archive.ubuntu.com/ubuntu jammy/universe amd64 libnbd0 amd64 1.10.5-1 [71.3 kB]
2026-03-24T16:51:57.186 INFO:teuthology.orchestra.run.vm01.stdout:Get:10 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-wcwidth all 0.2.5+dfsg1-1 [21.9 kB]
2026-03-24T16:51:57.187 INFO:teuthology.orchestra.run.vm01.stdout:Get:11 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-prettytable all 2.5.0-2 [31.3 kB]
2026-03-24T16:51:57.187 INFO:teuthology.orchestra.run.vm01.stdout:Get:12 http://archive.ubuntu.com/ubuntu jammy/universe amd64 librdkafka1 amd64 1.8.0-1build1 [633 kB]
2026-03-24T16:51:57.218 INFO:teuthology.orchestra.run.vm01.stdout:Get:13 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 liboath0 amd64 2.6.7-3ubuntu0.1 [41.3 kB]
2026-03-24T16:51:57.219 INFO:teuthology.orchestra.run.vm01.stdout:Get:14 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.functools all 3.4.0-2 [9030 B]
2026-03-24T16:51:57.220 INFO:teuthology.orchestra.run.vm01.stdout:Get:15 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-cheroot all 8.5.2+ds1-1ubuntu3.1 [71.1 kB]
2026-03-24T16:51:57.284 INFO:teuthology.orchestra.run.vm01.stdout:Get:16 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.classes all 3.2.1-3 [6452 B]
2026-03-24T16:51:57.284 INFO:teuthology.orchestra.run.vm01.stdout:Get:17 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.text all 3.6.0-2 [8716 B]
2026-03-24T16:51:57.286 INFO:teuthology.orchestra.run.vm01.stdout:Get:18 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 librados2 amd64 20.2.0-712-g70f8415b-1jammy [3583 kB]
2026-03-24T16:51:57.386 INFO:teuthology.orchestra.run.vm01.stdout:Get:19 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.collections all 3.4.0-2 [11.4 kB]
2026-03-24T16:51:57.387 INFO:teuthology.orchestra.run.vm01.stdout:Get:20 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-tempora all 4.1.2-1 [14.8 kB]
2026-03-24T16:51:57.387 INFO:teuthology.orchestra.run.vm01.stdout:Get:21 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-portend all 3.0.0-1 [7240 B]
2026-03-24T16:51:57.387 INFO:teuthology.orchestra.run.vm01.stdout:Get:22 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-zc.lockfile all 2.0-1 [8980 B]
2026-03-24T16:51:57.387 INFO:teuthology.orchestra.run.vm01.stdout:Get:23 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-cherrypy3 all 18.6.1-4 [208 kB]
2026-03-24T16:51:57.392 INFO:teuthology.orchestra.run.vm01.stdout:Get:24 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-natsort all 8.0.2-1 [35.3 kB]
2026-03-24T16:51:57.392 INFO:teuthology.orchestra.run.vm01.stdout:Get:25 http://archive.ubuntu.com/ubuntu jammy/universe amd64 libfuse2 amd64 2.9.9-5ubuntu3 [90.3 kB]
2026-03-24T16:51:57.394 INFO:teuthology.orchestra.run.vm01.stdout:Get:26 http://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 python3-asyncssh all 2.5.0-1ubuntu0.1 [189 kB]
2026-03-24T16:51:57.489 INFO:teuthology.orchestra.run.vm01.stdout:Get:27 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-repoze.lru all 0.7-2 [12.1 kB]
2026-03-24T16:51:57.489 INFO:teuthology.orchestra.run.vm01.stdout:Get:28 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-routes all 2.5.1-1ubuntu1 [89.0 kB]
2026-03-24T16:51:57.532 INFO:teuthology.orchestra.run.vm01.stdout:Get:29 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 libcephfs2 amd64 20.2.0-712-g70f8415b-1jammy [829 kB]
2026-03-24T16:51:57.573 INFO:teuthology.orchestra.run.vm01.stdout:Get:30 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 python3-rados amd64 20.2.0-712-g70f8415b-1jammy [364 kB]
2026-03-24T16:51:57.592 INFO:teuthology.orchestra.run.vm01.stdout:Get:31 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-sklearn-lib amd64 0.23.2-5ubuntu6 [2058 kB]
2026-03-24T16:51:57.638 INFO:teuthology.orchestra.run.vm01.stdout:Get:32 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-joblib all 0.17.0-4ubuntu1 [204 kB]
2026-03-24T16:51:57.641 INFO:teuthology.orchestra.run.vm01.stdout:Get:33 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-threadpoolctl all 3.1.0-1 [21.3 kB]
2026-03-24T16:51:57.642 INFO:teuthology.orchestra.run.vm01.stdout:Get:34 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-sklearn all 0.23.2-5ubuntu6 [1829 kB]
2026-03-24T16:51:57.647 INFO:teuthology.orchestra.run.vm01.stdout:Get:35 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 python3-ceph-argparse all 20.2.0-712-g70f8415b-1jammy [32.8 kB]
2026-03-24T16:51:57.648 INFO:teuthology.orchestra.run.vm01.stdout:Get:36 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 python3-cephfs amd64 20.2.0-712-g70f8415b-1jammy [184 kB]
2026-03-24T16:51:57.654 INFO:teuthology.orchestra.run.vm01.stdout:Get:37 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 python3-ceph-common all 20.2.0-712-g70f8415b-1jammy [83.8 kB]
2026-03-24T16:51:57.654 INFO:teuthology.orchestra.run.vm01.stdout:Get:38 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 python3-rbd amd64 20.2.0-712-g70f8415b-1jammy [341 kB]
2026-03-24T16:51:57.810 INFO:teuthology.orchestra.run.vm01.stdout:Get:39 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 librgw2 amd64 20.2.0-712-g70f8415b-1jammy [8697 kB]
2026-03-24T16:51:57.818 INFO:teuthology.orchestra.run.vm01.stdout:Get:40 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-cachetools all 5.0.0-1 [9722 B]
2026-03-24T16:51:57.818 INFO:teuthology.orchestra.run.vm01.stdout:Get:41 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-rsa all 4.8-1 [28.4 kB]
2026-03-24T16:51:57.818 INFO:teuthology.orchestra.run.vm01.stdout:Get:42 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-google-auth all 1.5.1-3 [35.7 kB]
2026-03-24T16:51:57.819 INFO:teuthology.orchestra.run.vm01.stdout:Get:43 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-requests-oauthlib all 1.3.0+ds-0.1 [18.7 kB]
2026-03-24T16:51:57.819 INFO:teuthology.orchestra.run.vm01.stdout:Get:44 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-websocket all 1.2.3-1 [34.7 kB]
2026-03-24T16:51:57.819 INFO:teuthology.orchestra.run.vm01.stdout:Get:45 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-kubernetes all 12.0.1-1ubuntu1 [353 kB]
2026-03-24T16:51:57.820 INFO:teuthology.orchestra.run.vm01.stdout:Get:46 http://archive.ubuntu.com/ubuntu jammy/main amd64 libonig5 amd64 6.9.7.1-2build1 [172 kB]
2026-03-24T16:51:57.821 INFO:teuthology.orchestra.run.vm01.stdout:Get:47 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 libjq1 amd64 1.6-2.1ubuntu3.1 [133 kB]
2026-03-24T16:51:57.822 INFO:teuthology.orchestra.run.vm01.stdout:Get:48 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 jq amd64 1.6-2.1ubuntu3.1 [52.5 kB]
2026-03-24T16:51:57.921 INFO:teuthology.orchestra.run.vm01.stdout:Get:49 http://archive.ubuntu.com/ubuntu jammy/main amd64 socat amd64 1.7.4.1-3ubuntu4 [349 kB]
2026-03-24T16:51:57.925 INFO:teuthology.orchestra.run.vm01.stdout:Get:50 http://archive.ubuntu.com/ubuntu jammy/universe amd64 xmlstarlet amd64 1.6.1-2.1 [265 kB]
2026-03-24T16:51:57.928 INFO:teuthology.orchestra.run.vm01.stdout:Get:51 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 nvme-cli amd64 1.16-3ubuntu0.3 [474 kB]
2026-03-24T16:51:58.097 INFO:teuthology.orchestra.run.vm01.stdout:Get:52 http://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 python-asyncssh-doc all 2.5.0-1ubuntu0.1 [309 kB]
2026-03-24T16:51:58.099 INFO:teuthology.orchestra.run.vm01.stdout:Get:53 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-iniconfig all 1.1.1-2 [6024 B]
2026-03-24T16:51:58.099 INFO:teuthology.orchestra.run.vm01.stdout:Get:54 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-pluggy all 0.13.0-7.1 [19.0 kB]
2026-03-24T16:51:58.099 INFO:teuthology.orchestra.run.vm01.stdout:Get:55 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-psutil amd64 5.9.0-1build1 [158 kB]
2026-03-24T16:51:58.100 INFO:teuthology.orchestra.run.vm01.stdout:Get:56 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-py all 1.10.0-1 [71.9 kB]
2026-03-24T16:51:58.103 INFO:teuthology.orchestra.run.vm01.stdout:Get:57 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-pygments all 2.11.2+dfsg-2ubuntu0.1 [750 kB]
2026-03-24T16:51:58.127 INFO:teuthology.orchestra.run.vm01.stdout:Get:58 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-toml all 0.10.2-1 [16.5 kB]
2026-03-24T16:51:58.131 INFO:teuthology.orchestra.run.vm01.stdout:Get:59 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-pytest all 6.2.5-1ubuntu2 [214 kB]
2026-03-24T16:51:58.229 INFO:teuthology.orchestra.run.vm01.stdout:Get:60 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-simplejson amd64 3.17.6-1build1 [54.7 kB]
2026-03-24T16:51:58.230 INFO:teuthology.orchestra.run.vm01.stdout:Get:61 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-webob all 1:1.8.6-1.1ubuntu0.1 [86.7 kB]
2026-03-24T16:51:58.254 INFO:teuthology.orchestra.run.vm01.stdout:Get:62 http://archive.ubuntu.com/ubuntu jammy/universe amd64 qttranslations5-l10n all 5.15.3-1 [1983 kB]
2026-03-24T16:51:58.412 INFO:teuthology.orchestra.run.vm01.stdout:Get:63 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 smartmontools amd64 7.2-1ubuntu0.1 [583 kB]
2026-03-24T16:51:58.412 INFO:teuthology.orchestra.run.vm01.stdout:Get:64 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 python3-rgw amd64 20.2.0-712-g70f8415b-1jammy [112 kB]
2026-03-24T16:51:58.412 INFO:teuthology.orchestra.run.vm01.stdout:Get:65 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 libradosstriper1 amd64 20.2.0-712-g70f8415b-1jammy [261 kB]
2026-03-24T16:51:58.413 INFO:teuthology.orchestra.run.vm01.stdout:Get:66 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-common amd64 20.2.0-712-g70f8415b-1jammy [29.3 MB]
2026-03-24T16:52:00.588 INFO:teuthology.orchestra.run.vm01.stdout:Get:67 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-base amd64 20.2.0-712-g70f8415b-1jammy [5415 kB]
2026-03-24T16:52:01.030 INFO:teuthology.orchestra.run.vm01.stdout:Get:68 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-modules-core all 20.2.0-712-g70f8415b-1jammy [246 kB]
2026-03-24T16:52:01.033 INFO:teuthology.orchestra.run.vm01.stdout:Get:69 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 libsqlite3-mod-ceph amd64 20.2.0-712-g70f8415b-1jammy [124 kB]
2026-03-24T16:52:01.035 INFO:teuthology.orchestra.run.vm01.stdout:Get:70 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr amd64 20.2.0-712-g70f8415b-1jammy [906 kB]
2026-03-24T16:52:01.048 INFO:teuthology.orchestra.run.vm01.stdout:Get:71 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mon amd64 20.2.0-712-g70f8415b-1jammy [6399 kB]
2026-03-24T16:52:01.520 INFO:teuthology.orchestra.run.vm01.stdout:Get:72 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-osd amd64 20.2.0-712-g70f8415b-1jammy [21.7 MB]
2026-03-24T16:52:02.993 INFO:teuthology.orchestra.run.vm01.stdout:Get:73 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph amd64 20.2.0-712-g70f8415b-1jammy [14.1 kB]
2026-03-24T16:52:03.031 INFO:teuthology.orchestra.run.vm01.stdout:Get:74 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-fuse amd64 20.2.0-712-g70f8415b-1jammy [955 kB]
2026-03-24T16:52:03.067 INFO:teuthology.orchestra.run.vm01.stdout:Get:75 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mds amd64 20.2.0-712-g70f8415b-1jammy [2341 kB]
2026-03-24T16:52:03.208 INFO:teuthology.orchestra.run.vm01.stdout:Get:76 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 cephadm amd64 20.2.0-712-g70f8415b-1jammy [1049 kB]
2026-03-24T16:52:03.304 INFO:teuthology.orchestra.run.vm01.stdout:Get:77 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-cephadm all 20.2.0-712-g70f8415b-1jammy [179 kB]
2026-03-24T16:52:03.307 INFO:teuthology.orchestra.run.vm01.stdout:Get:78 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-dashboard all 20.2.0-712-g70f8415b-1jammy [45.5 MB]
2026-03-24T16:52:06.595 INFO:teuthology.orchestra.run.vm01.stdout:Get:79 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-diskprediction-local all 20.2.0-712-g70f8415b-1jammy [8625 kB]
2026-03-24T16:52:06.965 INFO:teuthology.orchestra.run.vm01.stdout:Get:80 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-k8sevents all 20.2.0-712-g70f8415b-1jammy [14.2 kB]
2026-03-24T16:52:06.965 INFO:teuthology.orchestra.run.vm01.stdout:Get:81 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-test amd64 20.2.0-712-g70f8415b-1jammy [99.5 MB]
2026-03-24T16:52:14.656 INFO:teuthology.orchestra.run.vm01.stdout:Get:82 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-volume all 20.2.0-712-g70f8415b-1jammy [135 kB]
2026-03-24T16:52:14.657 INFO:teuthology.orchestra.run.vm01.stdout:Get:83 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 libcephfs-daemon amd64 20.2.0-712-g70f8415b-1jammy [43.3 kB]
2026-03-24T16:52:14.657 INFO:teuthology.orchestra.run.vm01.stdout:Get:84 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 libcephfs-proxy2 amd64 20.2.0-712-g70f8415b-1jammy [30.7 kB]
2026-03-24T16:52:14.657 INFO:teuthology.orchestra.run.vm01.stdout:Get:85 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 libcephfs-dev amd64 20.2.0-712-g70f8415b-1jammy [41.5 kB]
2026-03-24T16:52:14.657 INFO:teuthology.orchestra.run.vm01.stdout:Get:86 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 radosgw amd64 20.2.0-712-g70f8415b-1jammy [25.1 MB]
2026-03-24T16:52:17.364 INFO:teuthology.orchestra.run.vm01.stdout:Get:87 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 rbd-fuse amd64 20.2.0-712-g70f8415b-1jammy [97.9 kB]
2026-03-24T16:52:17.683 INFO:teuthology.orchestra.run.vm01.stdout:Fetched 281 MB in 22s (13.0 MB/s)
2026-03-24T16:52:17.710 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package liblttng-ust1:amd64.
2026-03-24T16:52:17.762 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... 119262 files and directories currently installed.)
2026-03-24T16:52:17.764 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../00-liblttng-ust1_2.13.1-1ubuntu1_amd64.deb ...
2026-03-24T16:52:17.766 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking liblttng-ust1:amd64 (2.13.1-1ubuntu1) ...
2026-03-24T16:52:17.788 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libdouble-conversion3:amd64.
2026-03-24T16:52:17.794 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../01-libdouble-conversion3_3.1.7-4_amd64.deb ...
2026-03-24T16:52:17.795 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libdouble-conversion3:amd64 (3.1.7-4) ...
2026-03-24T16:52:17.810 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libpcre2-16-0:amd64.
2026-03-24T16:52:17.817 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../02-libpcre2-16-0_10.39-3ubuntu0.1_amd64.deb ...
2026-03-24T16:52:17.818 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libpcre2-16-0:amd64 (10.39-3ubuntu0.1) ...
2026-03-24T16:52:17.841 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libqt5core5a:amd64.
2026-03-24T16:52:17.848 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../03-libqt5core5a_5.15.3+dfsg-2ubuntu0.2_amd64.deb ...
2026-03-24T16:52:17.853 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libqt5core5a:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-24T16:52:17.944 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libqt5dbus5:amd64.
2026-03-24T16:52:17.950 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../04-libqt5dbus5_5.15.3+dfsg-2ubuntu0.2_amd64.deb ...
2026-03-24T16:52:17.951 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libqt5dbus5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-24T16:52:17.971 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libqt5network5:amd64.
2026-03-24T16:52:17.977 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../05-libqt5network5_5.15.3+dfsg-2ubuntu0.2_amd64.deb ...
2026-03-24T16:52:17.978 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libqt5network5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-24T16:52:18.007 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libthrift-0.16.0:amd64.
2026-03-24T16:52:18.014 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../06-libthrift-0.16.0_0.16.0-2_amd64.deb ...
2026-03-24T16:52:18.015 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libthrift-0.16.0:amd64 (0.16.0-2) ...
2026-03-24T16:52:18.041 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../07-librbd1_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T16:52:18.044 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking librbd1 (20.2.0-712-g70f8415b-1jammy) over (17.2.9-0ubuntu0.22.04.2) ...
2026-03-24T16:52:18.137 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../08-librados2_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T16:52:18.140 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking librados2 (20.2.0-712-g70f8415b-1jammy) over (17.2.9-0ubuntu0.22.04.2) ...
2026-03-24T16:52:18.212 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libnbd0.
2026-03-24T16:52:18.220 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../09-libnbd0_1.10.5-1_amd64.deb ...
2026-03-24T16:52:18.220 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libnbd0 (1.10.5-1) ...
2026-03-24T16:52:18.238 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libcephfs2.
2026-03-24T16:52:18.244 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../10-libcephfs2_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T16:52:18.245 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libcephfs2 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:18.271 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-rados.
2026-03-24T16:52:18.277 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../11-python3-rados_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T16:52:18.278 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-rados (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:18.299 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-ceph-argparse.
2026-03-24T16:52:18.305 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../12-python3-ceph-argparse_20.2.0-712-g70f8415b-1jammy_all.deb ...
2026-03-24T16:52:18.306 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-ceph-argparse (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:18.322 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-cephfs.
2026-03-24T16:52:18.328 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../13-python3-cephfs_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T16:52:18.329 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-cephfs (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:18.348 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-ceph-common.
2026-03-24T16:52:18.355 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../14-python3-ceph-common_20.2.0-712-g70f8415b-1jammy_all.deb ...
2026-03-24T16:52:18.356 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-ceph-common (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:18.380 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-wcwidth.
2026-03-24T16:52:18.387 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../15-python3-wcwidth_0.2.5+dfsg1-1_all.deb ...
2026-03-24T16:52:18.388 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-wcwidth (0.2.5+dfsg1-1) ...
2026-03-24T16:52:18.409 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-prettytable.
2026-03-24T16:52:18.417 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../16-python3-prettytable_2.5.0-2_all.deb ...
2026-03-24T16:52:18.419 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-prettytable (2.5.0-2) ...
2026-03-24T16:52:18.436 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-rbd.
2026-03-24T16:52:18.444 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../17-python3-rbd_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T16:52:18.445 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-rbd (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:18.467 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package librdkafka1:amd64.
2026-03-24T16:52:18.474 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../18-librdkafka1_1.8.0-1build1_amd64.deb ...
2026-03-24T16:52:18.475 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking librdkafka1:amd64 (1.8.0-1build1) ...
2026-03-24T16:52:18.500 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package librgw2.
2026-03-24T16:52:18.507 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../19-librgw2_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T16:52:18.508 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking librgw2 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:18.718 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-rgw.
2026-03-24T16:52:18.724 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../20-python3-rgw_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T16:52:18.725 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-rgw (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:18.745 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package liboath0:amd64.
2026-03-24T16:52:18.752 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../21-liboath0_2.6.7-3ubuntu0.1_amd64.deb ...
2026-03-24T16:52:18.753 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking liboath0:amd64 (2.6.7-3ubuntu0.1) ...
2026-03-24T16:52:18.771 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libradosstriper1.
2026-03-24T16:52:18.778 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../22-libradosstriper1_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T16:52:18.779 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libradosstriper1 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:18.803 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-common.
2026-03-24T16:52:18.810 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../23-ceph-common_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T16:52:18.811 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-common (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:19.533 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-base.
2026-03-24T16:52:19.539 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../24-ceph-base_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T16:52:19.545 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-base (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:19.671 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-jaraco.functools.
2026-03-24T16:52:19.678 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../25-python3-jaraco.functools_3.4.0-2_all.deb ...
2026-03-24T16:52:19.679 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-jaraco.functools (3.4.0-2) ...
2026-03-24T16:52:19.697 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-cheroot.
2026-03-24T16:52:19.703 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../26-python3-cheroot_8.5.2+ds1-1ubuntu3.1_all.deb ...
2026-03-24T16:52:19.704 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-cheroot (8.5.2+ds1-1ubuntu3.1) ...
2026-03-24T16:52:19.728 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-jaraco.classes.
2026-03-24T16:52:19.735 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../27-python3-jaraco.classes_3.2.1-3_all.deb ...
2026-03-24T16:52:19.736 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-jaraco.classes (3.2.1-3) ...
2026-03-24T16:52:19.756 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-jaraco.text.
2026-03-24T16:52:19.763 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../28-python3-jaraco.text_3.6.0-2_all.deb ...
2026-03-24T16:52:19.764 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-jaraco.text (3.6.0-2) ...
2026-03-24T16:52:19.783 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-jaraco.collections.
2026-03-24T16:52:19.790 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../29-python3-jaraco.collections_3.4.0-2_all.deb ...
2026-03-24T16:52:19.791 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-jaraco.collections (3.4.0-2) ...
2026-03-24T16:52:19.810 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-tempora.
2026-03-24T16:52:19.815 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../30-python3-tempora_4.1.2-1_all.deb ...
2026-03-24T16:52:19.816 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-tempora (4.1.2-1) ...
2026-03-24T16:52:19.836 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-portend.
2026-03-24T16:52:19.842 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../31-python3-portend_3.0.0-1_all.deb ...
2026-03-24T16:52:19.843 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-portend (3.0.0-1) ...
2026-03-24T16:52:19.859 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-zc.lockfile.
2026-03-24T16:52:19.866 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../32-python3-zc.lockfile_2.0-1_all.deb ...
2026-03-24T16:52:19.867 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-zc.lockfile (2.0-1) ...
2026-03-24T16:52:19.886 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-cherrypy3.
2026-03-24T16:52:19.891 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../33-python3-cherrypy3_18.6.1-4_all.deb ...
2026-03-24T16:52:19.892 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-cherrypy3 (18.6.1-4) ...
2026-03-24T16:52:19.924 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-natsort.
2026-03-24T16:52:19.931 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../34-python3-natsort_8.0.2-1_all.deb ...
2026-03-24T16:52:19.932 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-natsort (8.0.2-1) ...
2026-03-24T16:52:19.949 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-mgr-modules-core.
2026-03-24T16:52:19.955 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../35-ceph-mgr-modules-core_20.2.0-712-g70f8415b-1jammy_all.deb ...
2026-03-24T16:52:19.956 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-mgr-modules-core (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:19.991 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libsqlite3-mod-ceph.
2026-03-24T16:52:19.997 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../36-libsqlite3-mod-ceph_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T16:52:19.998 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libsqlite3-mod-ceph (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:20.017 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-mgr.
2026-03-24T16:52:20.023 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../37-ceph-mgr_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T16:52:20.023 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-mgr (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:20.053 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-mon.
2026-03-24T16:52:20.059 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../38-ceph-mon_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T16:52:20.060 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-mon (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:20.192 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libfuse2:amd64.
2026-03-24T16:52:20.199 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../39-libfuse2_2.9.9-5ubuntu3_amd64.deb ...
2026-03-24T16:52:20.200 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libfuse2:amd64 (2.9.9-5ubuntu3) ...
2026-03-24T16:52:20.221 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-osd.
2026-03-24T16:52:20.228 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../40-ceph-osd_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T16:52:20.229 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-osd (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:20.633 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph.
2026-03-24T16:52:20.639 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../41-ceph_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T16:52:20.640 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:20.659 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-fuse.
2026-03-24T16:52:20.666 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../42-ceph-fuse_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T16:52:20.667 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-fuse (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:20.701 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-mds.
2026-03-24T16:52:20.708 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../43-ceph-mds_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T16:52:20.709 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-mds (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:20.824 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package cephadm.
2026-03-24T16:52:20.831 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../44-cephadm_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T16:52:20.832 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking cephadm (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:20.855 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-asyncssh.
2026-03-24T16:52:20.863 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../45-python3-asyncssh_2.5.0-1ubuntu0.1_all.deb ...
2026-03-24T16:52:20.864 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-asyncssh (2.5.0-1ubuntu0.1) ...
2026-03-24T16:52:20.893 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-mgr-cephadm.
2026-03-24T16:52:20.900 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../46-ceph-mgr-cephadm_20.2.0-712-g70f8415b-1jammy_all.deb ...
2026-03-24T16:52:20.901 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-mgr-cephadm (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:20.929 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-repoze.lru.
2026-03-24T16:52:20.936 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../47-python3-repoze.lru_0.7-2_all.deb ...
2026-03-24T16:52:20.937 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-repoze.lru (0.7-2) ...
2026-03-24T16:52:20.955 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-routes.
2026-03-24T16:52:20.962 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../48-python3-routes_2.5.1-1ubuntu1_all.deb ...
2026-03-24T16:52:20.963 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-routes (2.5.1-1ubuntu1) ...
2026-03-24T16:52:20.989 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-mgr-dashboard.
2026-03-24T16:52:20.996 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../49-ceph-mgr-dashboard_20.2.0-712-g70f8415b-1jammy_all.deb ...
2026-03-24T16:52:20.997 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-mgr-dashboard (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:22.189 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-sklearn-lib:amd64.
2026-03-24T16:52:22.197 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../50-python3-sklearn-lib_0.23.2-5ubuntu6_amd64.deb ...
2026-03-24T16:52:22.198 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-sklearn-lib:amd64 (0.23.2-5ubuntu6) ...
2026-03-24T16:52:22.289 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-joblib.
2026-03-24T16:52:22.296 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../51-python3-joblib_0.17.0-4ubuntu1_all.deb ...
2026-03-24T16:52:22.297 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-joblib (0.17.0-4ubuntu1) ...
2026-03-24T16:52:22.338 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-threadpoolctl.
2026-03-24T16:52:22.341 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../52-python3-threadpoolctl_3.1.0-1_all.deb ...
2026-03-24T16:52:22.342 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-threadpoolctl (3.1.0-1) ...
2026-03-24T16:52:22.366 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-sklearn.
2026-03-24T16:52:22.369 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../53-python3-sklearn_0.23.2-5ubuntu6_all.deb ...
2026-03-24T16:52:22.370 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-sklearn (0.23.2-5ubuntu6) ...
2026-03-24T16:52:22.511 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-mgr-diskprediction-local.
2026-03-24T16:52:22.518 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../54-ceph-mgr-diskprediction-local_20.2.0-712-g70f8415b-1jammy_all.deb ...
2026-03-24T16:52:22.519 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-mgr-diskprediction-local (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:22.894 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-cachetools.
2026-03-24T16:52:22.899 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../55-python3-cachetools_5.0.0-1_all.deb ...
2026-03-24T16:52:22.900 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-cachetools (5.0.0-1) ...
2026-03-24T16:52:22.918 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-rsa.
2026-03-24T16:52:22.923 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../56-python3-rsa_4.8-1_all.deb ...
2026-03-24T16:52:22.924 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-rsa (4.8-1) ...
2026-03-24T16:52:22.946 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-google-auth.
2026-03-24T16:52:22.950 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../57-python3-google-auth_1.5.1-3_all.deb ...
2026-03-24T16:52:22.951 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-google-auth (1.5.1-3) ...
2026-03-24T16:52:22.974 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-requests-oauthlib.
2026-03-24T16:52:22.980 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../58-python3-requests-oauthlib_1.3.0+ds-0.1_all.deb ...
2026-03-24T16:52:22.981 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-requests-oauthlib (1.3.0+ds-0.1) ...
2026-03-24T16:52:23.002 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-websocket.
2026-03-24T16:52:23.008 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../59-python3-websocket_1.2.3-1_all.deb ...
2026-03-24T16:52:23.009 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-websocket (1.2.3-1) ...
2026-03-24T16:52:23.207 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-kubernetes.
2026-03-24T16:52:23.214 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../60-python3-kubernetes_12.0.1-1ubuntu1_all.deb ...
2026-03-24T16:52:23.215 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-kubernetes (12.0.1-1ubuntu1) ...
2026-03-24T16:52:23.574 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-mgr-k8sevents.
2026-03-24T16:52:23.578 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../61-ceph-mgr-k8sevents_20.2.0-712-g70f8415b-1jammy_all.deb ...
2026-03-24T16:52:23.579 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-mgr-k8sevents (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:23.598 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libonig5:amd64.
2026-03-24T16:52:23.601 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../62-libonig5_6.9.7.1-2build1_amd64.deb ...
2026-03-24T16:52:23.602 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libonig5:amd64 (6.9.7.1-2build1) ...
2026-03-24T16:52:23.622 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libjq1:amd64.
2026-03-24T16:52:23.625 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../63-libjq1_1.6-2.1ubuntu3.1_amd64.deb ...
2026-03-24T16:52:23.626 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libjq1:amd64 (1.6-2.1ubuntu3.1) ...
2026-03-24T16:52:23.642 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package jq.
2026-03-24T16:52:23.647 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../64-jq_1.6-2.1ubuntu3.1_amd64.deb ...
2026-03-24T16:52:23.648 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking jq (1.6-2.1ubuntu3.1) ...
2026-03-24T16:52:23.666 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package socat.
2026-03-24T16:52:23.670 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../65-socat_1.7.4.1-3ubuntu4_amd64.deb ...
2026-03-24T16:52:23.672 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking socat (1.7.4.1-3ubuntu4) ...
2026-03-24T16:52:23.702 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package xmlstarlet.
2026-03-24T16:52:23.708 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../66-xmlstarlet_1.6.1-2.1_amd64.deb ...
2026-03-24T16:52:23.709 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking xmlstarlet (1.6.1-2.1) ...
2026-03-24T16:52:23.766 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-test.
2026-03-24T16:52:23.769 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../67-ceph-test_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T16:52:23.770 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-test (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:25.998 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package ceph-volume.
2026-03-24T16:52:26.003 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../68-ceph-volume_20.2.0-712-g70f8415b-1jammy_all.deb ...
2026-03-24T16:52:26.004 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking ceph-volume (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:26.037 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libcephfs-daemon.
2026-03-24T16:52:26.044 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../69-libcephfs-daemon_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T16:52:26.045 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libcephfs-daemon (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:26.062 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libcephfs-proxy2.
2026-03-24T16:52:26.069 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../70-libcephfs-proxy2_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T16:52:26.070 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libcephfs-proxy2 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:26.085 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package libcephfs-dev.
2026-03-24T16:52:26.091 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../71-libcephfs-dev_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T16:52:26.093 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking libcephfs-dev (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:26.113 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package nvme-cli.
2026-03-24T16:52:26.120 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../72-nvme-cli_1.16-3ubuntu0.3_amd64.deb ...
2026-03-24T16:52:26.122 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking nvme-cli (1.16-3ubuntu0.3) ...
2026-03-24T16:52:26.164 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python-asyncssh-doc.
2026-03-24T16:52:26.171 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../73-python-asyncssh-doc_2.5.0-1ubuntu0.1_all.deb ...
2026-03-24T16:52:26.172 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python-asyncssh-doc (2.5.0-1ubuntu0.1) ...
2026-03-24T16:52:26.222 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-iniconfig.
2026-03-24T16:52:26.226 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../74-python3-iniconfig_1.1.1-2_all.deb ...
2026-03-24T16:52:26.227 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-iniconfig (1.1.1-2) ...
2026-03-24T16:52:26.250 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-pluggy.
2026-03-24T16:52:26.252 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../75-python3-pluggy_0.13.0-7.1_all.deb ...
2026-03-24T16:52:26.253 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-pluggy (0.13.0-7.1) ...
2026-03-24T16:52:26.277 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-psutil.
2026-03-24T16:52:26.283 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../76-python3-psutil_5.9.0-1build1_amd64.deb ...
2026-03-24T16:52:26.284 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-psutil (5.9.0-1build1) ...
2026-03-24T16:52:26.310 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-py.
2026-03-24T16:52:26.316 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../77-python3-py_1.10.0-1_all.deb ...
2026-03-24T16:52:26.317 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-py (1.10.0-1) ...
2026-03-24T16:52:26.346 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-pygments.
2026-03-24T16:52:26.353 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../78-python3-pygments_2.11.2+dfsg-2ubuntu0.1_all.deb ...
2026-03-24T16:52:26.354 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-pygments (2.11.2+dfsg-2ubuntu0.1) ...
2026-03-24T16:52:26.418 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-toml.
2026-03-24T16:52:26.423 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../79-python3-toml_0.10.2-1_all.deb ...
2026-03-24T16:52:26.424 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-toml (0.10.2-1) ...
2026-03-24T16:52:26.446 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-pytest.
2026-03-24T16:52:26.451 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../80-python3-pytest_6.2.5-1ubuntu2_all.deb ...
2026-03-24T16:52:26.452 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-pytest (6.2.5-1ubuntu2) ...
2026-03-24T16:52:26.498 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-simplejson.
2026-03-24T16:52:26.502 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../81-python3-simplejson_3.17.6-1build1_amd64.deb ...
2026-03-24T16:52:26.504 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-simplejson (3.17.6-1build1) ...
2026-03-24T16:52:26.530 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-webob.
2026-03-24T16:52:26.535 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../82-python3-webob_1%3a1.8.6-1.1ubuntu0.1_all.deb ...
2026-03-24T16:52:26.536 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-webob (1:1.8.6-1.1ubuntu0.1) ...
2026-03-24T16:52:26.558 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package qttranslations5-l10n.
2026-03-24T16:52:26.565 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../83-qttranslations5-l10n_5.15.3-1_all.deb ...
2026-03-24T16:52:26.565 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking qttranslations5-l10n (5.15.3-1) ...
2026-03-24T16:52:26.722 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package radosgw.
2026-03-24T16:52:26.729 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../84-radosgw_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T16:52:26.730 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking radosgw (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:27.734 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package rbd-fuse.
2026-03-24T16:52:27.737 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../85-rbd-fuse_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T16:52:27.738 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking rbd-fuse (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:27.758 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package smartmontools.
2026-03-24T16:52:27.762 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../86-smartmontools_7.2-1ubuntu0.1_amd64.deb ...
2026-03-24T16:52:27.770 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking smartmontools (7.2-1ubuntu0.1) ...
2026-03-24T16:52:27.820 INFO:teuthology.orchestra.run.vm01.stdout:Setting up smartmontools (7.2-1ubuntu0.1) ...
2026-03-24T16:52:28.088 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/smartd.service → /lib/systemd/system/smartmontools.service.
2026-03-24T16:52:28.089 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/smartmontools.service → /lib/systemd/system/smartmontools.service.
2026-03-24T16:52:28.505 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-iniconfig (1.1.1-2) ...
2026-03-24T16:52:28.573 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libdouble-conversion3:amd64 (3.1.7-4) ...
2026-03-24T16:52:28.576 INFO:teuthology.orchestra.run.vm01.stdout:Setting up nvme-cli (1.16-3ubuntu0.3) ...
2026-03-24T16:52:28.645 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/default.target.wants/nvmefc-boot-connections.service → /lib/systemd/system/nvmefc-boot-connections.service.
2026-03-24T16:52:28.897 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/default.target.wants/nvmf-autoconnect.service → /lib/systemd/system/nvmf-autoconnect.service.
2026-03-24T16:52:29.340 INFO:teuthology.orchestra.run.vm01.stdout:nvmf-connect.target is a disabled or a static unit, not starting it.
2026-03-24T16:52:29.363 INFO:teuthology.orchestra.run.vm01.stdout:Setting up cephadm (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:29.411 INFO:teuthology.orchestra.run.vm01.stdout:Adding system user cephadm....done
2026-03-24T16:52:29.421 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-jaraco.classes (3.2.1-3) ...
2026-03-24T16:52:29.493 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python-asyncssh-doc (2.5.0-1ubuntu0.1) ...
2026-03-24T16:52:29.496 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-jaraco.functools (3.4.0-2) ...
2026-03-24T16:52:29.571 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-repoze.lru (0.7-2) ...
2026-03-24T16:52:29.648 INFO:teuthology.orchestra.run.vm01.stdout:Setting up liboath0:amd64 (2.6.7-3ubuntu0.1) ...
2026-03-24T16:52:29.651 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-py (1.10.0-1) ...
2026-03-24T16:52:29.764 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-joblib (0.17.0-4ubuntu1) ...
2026-03-24T16:52:30.178 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-cachetools (5.0.0-1) ...
2026-03-24T16:52:30.326 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-threadpoolctl (3.1.0-1) ...
2026-03-24T16:52:30.439 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-ceph-argparse (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:30.538 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-sklearn-lib:amd64 (0.23.2-5ubuntu6) ...
2026-03-24T16:52:30.567 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libnbd0 (1.10.5-1) ...
2026-03-24T16:52:30.570 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libfuse2:amd64 (2.9.9-5ubuntu3) ...
2026-03-24T16:52:30.602 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libpcre2-16-0:amd64 (10.39-3ubuntu0.1) ...
2026-03-24T16:52:30.605 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-psutil (5.9.0-1build1) ...
2026-03-24T16:52:30.740 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-natsort (8.0.2-1) ...
2026-03-24T16:52:30.822 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libcephfs-proxy2 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:30.824 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-routes (2.5.1-1ubuntu1) ...
2026-03-24T16:52:30.926 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-simplejson (3.17.6-1build1) ...
2026-03-24T16:52:31.038 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-pygments (2.11.2+dfsg-2ubuntu0.1) ...
2026-03-24T16:52:31.340 INFO:teuthology.orchestra.run.vm01.stdout:Setting up qttranslations5-l10n (5.15.3-1) ...
2026-03-24T16:52:31.342 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-wcwidth (0.2.5+dfsg1-1) ...
2026-03-24T16:52:31.502 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-asyncssh (2.5.0-1ubuntu0.1) ...
2026-03-24T16:52:31.659 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-cheroot (8.5.2+ds1-1ubuntu3.1) ...
2026-03-24T16:52:31.758 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-jaraco.text (3.6.0-2) ...
2026-03-24T16:52:31.831 INFO:teuthology.orchestra.run.vm01.stdout:Setting up socat (1.7.4.1-3ubuntu4) ...
2026-03-24T16:52:31.833 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-ceph-common (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:31.936 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-sklearn (0.23.2-5ubuntu6) ...
2026-03-24T16:52:32.573 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libqt5core5a:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-24T16:52:32.579 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-toml (0.10.2-1) ...
2026-03-24T16:52:32.660 INFO:teuthology.orchestra.run.vm01.stdout:Setting up librdkafka1:amd64 (1.8.0-1build1) ...
2026-03-24T16:52:32.662 INFO:teuthology.orchestra.run.vm01.stdout:Setting up xmlstarlet (1.6.1-2.1) ...
2026-03-24T16:52:32.665 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-pluggy (0.13.0-7.1) ...
2026-03-24T16:52:32.744 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-zc.lockfile (2.0-1) ...
2026-03-24T16:52:32.817 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libqt5dbus5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-24T16:52:32.820 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-rsa (4.8-1) ...
2026-03-24T16:52:32.900 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-tempora (4.1.2-1) ...
2026-03-24T16:52:32.981 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-prettytable (2.5.0-2) ...
2026-03-24T16:52:33.066 INFO:teuthology.orchestra.run.vm01.stdout:Setting up liblttng-ust1:amd64 (2.13.1-1ubuntu1) ...
2026-03-24T16:52:33.069 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-websocket (1.2.3-1) ...
2026-03-24T16:52:33.160 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libonig5:amd64 (6.9.7.1-2build1) ...
2026-03-24T16:52:33.162 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-requests-oauthlib (1.3.0+ds-0.1) ...
2026-03-24T16:52:33.244 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-webob (1:1.8.6-1.1ubuntu0.1) ...
2026-03-24T16:52:33.344 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-jaraco.collections (3.4.0-2) ...
2026-03-24T16:52:33.423 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libjq1:amd64 (1.6-2.1ubuntu3.1) ...
2026-03-24T16:52:33.425 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-pytest (6.2.5-1ubuntu2) ...
2026-03-24T16:52:33.576 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-portend (3.0.0-1) ...
2026-03-24T16:52:33.651 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libqt5network5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-24T16:52:33.653 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-google-auth (1.5.1-3) ...
2026-03-24T16:52:33.739 INFO:teuthology.orchestra.run.vm01.stdout:Setting up jq (1.6-2.1ubuntu3.1) ...
2026-03-24T16:52:33.741 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-cherrypy3 (18.6.1-4) ...
2026-03-24T16:52:33.902 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libthrift-0.16.0:amd64 (0.16.0-2) ...
2026-03-24T16:52:33.905 INFO:teuthology.orchestra.run.vm01.stdout:Setting up librados2 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:33.907 INFO:teuthology.orchestra.run.vm01.stdout:Setting up librgw2 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:33.910 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libsqlite3-mod-ceph (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:33.913 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-kubernetes (12.0.1-1ubuntu1) ...
2026-03-24T16:52:34.660 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libcephfs2 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:34.662 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libradosstriper1 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:34.665 INFO:teuthology.orchestra.run.vm01.stdout:Setting up librbd1 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:34.668 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-mgr-modules-core (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:34.670 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-fuse (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:34.741 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/remote-fs.target.wants/ceph-fuse.target → /lib/systemd/system/ceph-fuse.target.
2026-03-24T16:52:34.741 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-fuse.target → /lib/systemd/system/ceph-fuse.target.
2026-03-24T16:52:35.126 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libcephfs-dev (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:35.128 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-rados (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:35.131 INFO:teuthology.orchestra.run.vm01.stdout:Setting up libcephfs-daemon (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:35.133 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-rbd (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:35.136 INFO:teuthology.orchestra.run.vm01.stdout:Setting up rbd-fuse (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:35.139 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-rgw (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:35.141 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-cephfs (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:35.144 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-common (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:35.181 INFO:teuthology.orchestra.run.vm01.stdout:Adding group ceph....done
2026-03-24T16:52:35.279 INFO:teuthology.orchestra.run.vm01.stdout:Adding system user ceph....done
2026-03-24T16:52:35.290 INFO:teuthology.orchestra.run.vm01.stdout:Setting system user ceph properties....done
2026-03-24T16:52:35.296 INFO:teuthology.orchestra.run.vm01.stdout:chown: cannot access '/var/log/ceph/*.log*': No such file or directory
2026-03-24T16:52:35.378 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /lib/systemd/system/ceph.target.
2026-03-24T16:52:35.641 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/rbdmap.service → /lib/systemd/system/rbdmap.service.
2026-03-24T16:52:36.060 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-test (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:36.063 INFO:teuthology.orchestra.run.vm01.stdout:Setting up radosgw (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T16:52:36.335 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-radosgw.target → /lib/systemd/system/ceph-radosgw.target. 2026-03-24T16:52:36.335 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-radosgw.target → /lib/systemd/system/ceph-radosgw.target. 2026-03-24T16:52:36.763 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-base (20.2.0-712-g70f8415b-1jammy) ... 2026-03-24T16:52:36.854 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-crash.service → /lib/systemd/system/ceph-crash.service. 2026-03-24T16:52:37.256 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-mds (20.2.0-712-g70f8415b-1jammy) ... 2026-03-24T16:52:37.329 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mds.target → /lib/systemd/system/ceph-mds.target. 2026-03-24T16:52:37.330 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mds.target → /lib/systemd/system/ceph-mds.target. 2026-03-24T16:52:37.734 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-mgr (20.2.0-712-g70f8415b-1jammy) ... 2026-03-24T16:52:37.817 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mgr.target → /lib/systemd/system/ceph-mgr.target. 2026-03-24T16:52:37.817 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mgr.target → /lib/systemd/system/ceph-mgr.target. 2026-03-24T16:52:38.216 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-osd (20.2.0-712-g70f8415b-1jammy) ... 2026-03-24T16:52:38.303 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-osd.target → /lib/systemd/system/ceph-osd.target. 
2026-03-24T16:52:38.303 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-osd.target → /lib/systemd/system/ceph-osd.target. 2026-03-24T16:52:38.717 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-mgr-k8sevents (20.2.0-712-g70f8415b-1jammy) ... 2026-03-24T16:52:38.719 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-mgr-diskprediction-local (20.2.0-712-g70f8415b-1jammy) ... 2026-03-24T16:52:38.733 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-mon (20.2.0-712-g70f8415b-1jammy) ... 2026-03-24T16:52:38.800 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mon.target → /lib/systemd/system/ceph-mon.target. 2026-03-24T16:52:38.800 INFO:teuthology.orchestra.run.vm01.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mon.target → /lib/systemd/system/ceph-mon.target. 2026-03-24T16:52:39.171 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-mgr-cephadm (20.2.0-712-g70f8415b-1jammy) ... 2026-03-24T16:52:39.186 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph (20.2.0-712-g70f8415b-1jammy) ... 2026-03-24T16:52:39.188 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-mgr-dashboard (20.2.0-712-g70f8415b-1jammy) ... 2026-03-24T16:52:39.203 INFO:teuthology.orchestra.run.vm01.stdout:Setting up ceph-volume (20.2.0-712-g70f8415b-1jammy) ... 2026-03-24T16:52:39.338 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for man-db (2.10.2-1) ... 2026-03-24T16:52:39.426 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ... 2026-03-24T16:52:40.035 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:40.035 INFO:teuthology.orchestra.run.vm01.stdout:Running kernel seems to be up-to-date. 
2026-03-24T16:52:40.035 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:40.035 INFO:teuthology.orchestra.run.vm01.stdout:Services to be restarted: 2026-03-24T16:52:40.038 INFO:teuthology.orchestra.run.vm01.stdout: systemctl restart apache-htcacheclean.service 2026-03-24T16:52:40.045 INFO:teuthology.orchestra.run.vm01.stdout: systemctl restart rsyslog.service 2026-03-24T16:52:40.048 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:40.048 INFO:teuthology.orchestra.run.vm01.stdout:Service restarts being deferred: 2026-03-24T16:52:40.048 INFO:teuthology.orchestra.run.vm01.stdout: systemctl restart networkd-dispatcher.service 2026-03-24T16:52:40.048 INFO:teuthology.orchestra.run.vm01.stdout: systemctl restart unattended-upgrades.service 2026-03-24T16:52:40.048 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:40.049 INFO:teuthology.orchestra.run.vm01.stdout:No containers need to be restarted. 2026-03-24T16:52:40.049 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:40.049 INFO:teuthology.orchestra.run.vm01.stdout:No user sessions are running outdated binaries. 2026-03-24T16:52:40.049 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:40.049 INFO:teuthology.orchestra.run.vm01.stdout:No VM guests are running outdated hypervisor (qemu) binaries on this host. 2026-03-24T16:52:41.140 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-24T16:52:41.143 DEBUG:teuthology.orchestra.run.vm01:> sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" install python3-jmespath python3-xmltodict s3cmd 2026-03-24T16:52:41.224 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists... 2026-03-24T16:52:41.434 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree... 2026-03-24T16:52:41.434 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information... 
2026-03-24T16:52:41.558 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required: 2026-03-24T16:52:41.558 INFO:teuthology.orchestra.run.vm01.stdout: kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1 2026-03-24T16:52:41.558 INFO:teuthology.orchestra.run.vm01.stdout: libsgutils2-2 sg3-utils sg3-utils-udev 2026-03-24T16:52:41.558 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-24T16:52:41.570 INFO:teuthology.orchestra.run.vm01.stdout:The following NEW packages will be installed: 2026-03-24T16:52:41.570 INFO:teuthology.orchestra.run.vm01.stdout: python3-jmespath python3-xmltodict s3cmd 2026-03-24T16:52:41.598 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 3 newly installed, 0 to remove and 44 not upgraded. 2026-03-24T16:52:41.598 INFO:teuthology.orchestra.run.vm01.stdout:Need to get 155 kB of archives. 2026-03-24T16:52:41.598 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 678 kB of additional disk space will be used. 2026-03-24T16:52:41.598 INFO:teuthology.orchestra.run.vm01.stdout:Get:1 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jmespath all 0.10.0-1 [21.7 kB] 2026-03-24T16:52:41.615 INFO:teuthology.orchestra.run.vm01.stdout:Get:2 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-xmltodict all 0.12.0-2 [12.6 kB] 2026-03-24T16:52:41.617 INFO:teuthology.orchestra.run.vm01.stdout:Get:3 http://archive.ubuntu.com/ubuntu jammy/universe amd64 s3cmd all 2.2.0-1 [120 kB] 2026-03-24T16:52:41.858 INFO:teuthology.orchestra.run.vm01.stdout:Fetched 155 kB in 0s (2526 kB/s) 2026-03-24T16:52:41.877 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-jmespath. 2026-03-24T16:52:41.911 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 
25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 126082 files and directories currently installed.) 2026-03-24T16:52:41.914 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../python3-jmespath_0.10.0-1_all.deb ... 2026-03-24T16:52:41.915 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-jmespath (0.10.0-1) ... 2026-03-24T16:52:41.937 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package python3-xmltodict. 2026-03-24T16:52:41.944 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../python3-xmltodict_0.12.0-2_all.deb ... 2026-03-24T16:52:41.945 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking python3-xmltodict (0.12.0-2) ... 2026-03-24T16:52:41.963 INFO:teuthology.orchestra.run.vm01.stdout:Selecting previously unselected package s3cmd. 2026-03-24T16:52:41.970 INFO:teuthology.orchestra.run.vm01.stdout:Preparing to unpack .../archives/s3cmd_2.2.0-1_all.deb ... 2026-03-24T16:52:41.971 INFO:teuthology.orchestra.run.vm01.stdout:Unpacking s3cmd (2.2.0-1) ... 2026-03-24T16:52:42.010 INFO:teuthology.orchestra.run.vm01.stdout:Setting up s3cmd (2.2.0-1) ... 2026-03-24T16:52:42.108 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-xmltodict (0.12.0-2) ... 2026-03-24T16:52:42.182 INFO:teuthology.orchestra.run.vm01.stdout:Setting up python3-jmespath (0.10.0-1) ... 2026-03-24T16:52:42.264 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for man-db (2.10.2-1) ... 
2026-03-24T16:52:42.625 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:42.625 INFO:teuthology.orchestra.run.vm01.stdout:Running kernel seems to be up-to-date. 2026-03-24T16:52:42.625 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:42.625 INFO:teuthology.orchestra.run.vm01.stdout:Services to be restarted: 2026-03-24T16:52:42.629 INFO:teuthology.orchestra.run.vm01.stdout: systemctl restart apache-htcacheclean.service 2026-03-24T16:52:42.637 INFO:teuthology.orchestra.run.vm01.stdout: systemctl restart rsyslog.service 2026-03-24T16:52:42.641 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:42.641 INFO:teuthology.orchestra.run.vm01.stdout:Service restarts being deferred: 2026-03-24T16:52:42.641 INFO:teuthology.orchestra.run.vm01.stdout: systemctl restart networkd-dispatcher.service 2026-03-24T16:52:42.641 INFO:teuthology.orchestra.run.vm01.stdout: systemctl restart unattended-upgrades.service 2026-03-24T16:52:42.641 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:42.641 INFO:teuthology.orchestra.run.vm01.stdout:No containers need to be restarted. 2026-03-24T16:52:42.641 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:42.641 INFO:teuthology.orchestra.run.vm01.stdout:No user sessions are running outdated binaries. 2026-03-24T16:52:42.641 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:42.641 INFO:teuthology.orchestra.run.vm01.stdout:No VM guests are running outdated hypervisor (qemu) binaries on this host. 2026-03-24T16:52:43.657 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 
2026-03-24T16:52:43.661 DEBUG:teuthology.parallel:result is None 2026-03-24T16:52:43.662 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=ubuntu%2F22.04%2Fx86_64&sha1=70f8415b300f041766fa27faf7d5472699e32388 2026-03-24T16:52:44.240 DEBUG:teuthology.orchestra.run.vm01:> dpkg-query -W -f '${Version}' ceph 2026-03-24T16:52:44.249 INFO:teuthology.orchestra.run.vm01.stdout:20.2.0-712-g70f8415b-1jammy 2026-03-24T16:52:44.256 INFO:teuthology.packaging:The installed version of ceph is 20.2.0-712-g70f8415b-1jammy 2026-03-24T16:52:44.256 INFO:teuthology.task.install:The correct ceph version 20.2.0-712-g70f8415b-1jammy is installed. 2026-03-24T16:52:44.257 INFO:teuthology.task.install.util:Shipping valgrind.supp... 2026-03-24T16:52:44.257 DEBUG:teuthology.orchestra.run.vm01:> set -ex 2026-03-24T16:52:44.257 DEBUG:teuthology.orchestra.run.vm01:> sudo dd of=/home/ubuntu/cephtest/valgrind.supp 2026-03-24T16:52:44.300 INFO:teuthology.task.install.util:Shipping 'daemon-helper'... 2026-03-24T16:52:44.300 DEBUG:teuthology.orchestra.run.vm01:> set -ex 2026-03-24T16:52:44.300 DEBUG:teuthology.orchestra.run.vm01:> sudo dd of=/usr/bin/daemon-helper 2026-03-24T16:52:44.351 DEBUG:teuthology.orchestra.run.vm01:> sudo chmod a=rx -- /usr/bin/daemon-helper 2026-03-24T16:52:44.403 INFO:teuthology.task.install.util:Shipping 'adjust-ulimits'... 2026-03-24T16:52:44.403 DEBUG:teuthology.orchestra.run.vm01:> set -ex 2026-03-24T16:52:44.403 DEBUG:teuthology.orchestra.run.vm01:> sudo dd of=/usr/bin/adjust-ulimits 2026-03-24T16:52:44.454 DEBUG:teuthology.orchestra.run.vm01:> sudo chmod a=rx -- /usr/bin/adjust-ulimits 2026-03-24T16:52:44.507 INFO:teuthology.task.install.util:Shipping 'stdin-killer'... 
2026-03-24T16:52:44.508 DEBUG:teuthology.orchestra.run.vm01:> set -ex 2026-03-24T16:52:44.508 DEBUG:teuthology.orchestra.run.vm01:> sudo dd of=/usr/bin/stdin-killer 2026-03-24T16:52:44.558 DEBUG:teuthology.orchestra.run.vm01:> sudo chmod a=rx -- /usr/bin/stdin-killer 2026-03-24T16:52:44.610 INFO:teuthology.run_tasks:Running task ceph... 2026-03-24T16:52:44.654 INFO:tasks.ceph:Making ceph log dir writeable by non-root... 2026-03-24T16:52:44.654 DEBUG:teuthology.orchestra.run.vm01:> sudo chmod 777 /var/log/ceph 2026-03-24T16:52:44.663 INFO:tasks.ceph:Disabling ceph logrotate... 2026-03-24T16:52:44.663 DEBUG:teuthology.orchestra.run.vm01:> sudo rm -f -- /etc/logrotate.d/ceph 2026-03-24T16:52:44.714 INFO:tasks.ceph:Creating extra log directories... 2026-03-24T16:52:44.714 DEBUG:teuthology.orchestra.run.vm01:> sudo install -d -m0777 -- /var/log/ceph/valgrind /var/log/ceph/profiling-logger 2026-03-24T16:52:44.770 INFO:tasks.ceph:Creating ceph cluster ceph... 2026-03-24T16:52:44.770 INFO:tasks.ceph:config {'conf': {'client': {'rbd default format': 1}, 'global': {'mon client directed command retry': 5, 'mon warn on pool no app': False, 'ms inject socket failures': 5000}, 'mgr': {'debug mgr': 20, 'debug ms': 1}, 'mon': {'debug mon': 20, 'debug ms': 1, 'debug paxos': 20}, 'osd': {'bluestore block size': 96636764160, 'bluestore compression algorithm': 'zlib', 'bluestore compression mode': 'aggressive', 'bluestore fsck on mount': True, 'debug bluefs': '1/20', 'debug bluestore': '1/20', 'debug ms': 1, 'debug osd': 20, 'debug rocksdb': '4/10', 'mon osd backfillfull_ratio': 0.85, 'mon osd full ratio': 0.9, 'mon osd nearfull ratio': 0.8, 'osd failsafe full ratio': 0.95, 'osd mclock iops capacity threshold hdd': 49000, 'osd objectstore': 'bluestore', 'osd shutdown pgref assert': True}}, 'fs': 'xfs', 'mkfs_options': None, 'mount_options': None, 'skip_mgr_daemons': False, 'log_ignorelist': ['\\(MDS_ALL_DOWN\\)', '\\(MDS_UP_LESS_THAN_MAX\\)', '\\(OSD_SLOW_PING_TIME'], 'cpu_profile': 
set(), 'cluster': 'ceph', 'mon_bind_msgr2': True, 'mon_bind_addrvec': True} 2026-03-24T16:52:44.770 INFO:tasks.ceph:ctx.config {'archive_path': '/archive/kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps/3621', 'branch': 'tentacle', 'description': 'rbd/cli_v1/{base/install clusters/{fixed-1} conf/{disable-pool-app} features/format-1 msgr-failures/few objectstore/bluestore-comp-zlib supported-random-distro$/{ubuntu_latest} workloads/rbd_cli_generic}', 'email': None, 'first_in_suite': False, 'flavor': 'default', 'job_id': '3621', 'ktype': 'distro', 'last_in_suite': False, 'machine_type': 'vps', 'name': 'kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps', 'no_nested_subset': False, 'os_type': 'ubuntu', 'os_version': '22.04', 'overrides': {'admin_socket': {'branch': 'tentacle'}, 'ansible.cephlab': {'branch': 'main', 'repo': 'https://github.com/kshtsk/ceph-cm-ansible.git', 'skip_tags': 'nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs', 'vars': {'logical_volumes': {'lv_1': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_2': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_3': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_4': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}}, 'timezone': 'UTC', 'volume_groups': {'vg_nvme': {'pvs': '/dev/vdb,/dev/vdc,/dev/vdd,/dev/vde'}}}}, 'ceph': {'conf': {'client': {'rbd default format': 1}, 'global': {'mon client directed command retry': 5, 'mon warn on pool no app': False, 'ms inject socket failures': 5000}, 'mgr': {'debug mgr': 20, 'debug ms': 1}, 'mon': {'debug mon': 20, 'debug ms': 1, 'debug paxos': 20}, 'osd': {'bluestore block size': 96636764160, 'bluestore compression algorithm': 'zlib', 'bluestore compression mode': 'aggressive', 'bluestore fsck on mount': True, 'debug bluefs': '1/20', 'debug bluestore': '1/20', 'debug ms': 1, 'debug osd': 20, 'debug rocksdb': '4/10', 'mon osd backfillfull_ratio': 0.85, 'mon osd full ratio': 
0.9, 'mon osd nearfull ratio': 0.8, 'osd failsafe full ratio': 0.95, 'osd mclock iops capacity threshold hdd': 49000, 'osd objectstore': 'bluestore', 'osd shutdown pgref assert': True}}, 'flavor': 'default', 'fs': 'xfs', 'log-ignorelist': ['\\(MDS_ALL_DOWN\\)', '\\(MDS_UP_LESS_THAN_MAX\\)', '\\(OSD_SLOW_PING_TIME'], 'sha1': '70f8415b300f041766fa27faf7d5472699e32388'}, 'ceph-deploy': {'conf': {'client': {'log file': '/var/log/ceph/ceph-$name.$pid.log'}, 'global': {'osd crush chooseleaf type': 0, 'osd pool default pg num': 128, 'osd pool default pgp num': 128, 'osd pool default size': 2}, 'mon': {}}}, 'cephadm': {'cephadm_binary_url': 'https://download.ceph.com/rpm-20.2.0/el9/noarch/cephadm'}, 'install': {'ceph': {'flavor': 'default', 'sha1': '70f8415b300f041766fa27faf7d5472699e32388'}, 'extra_system_packages': {'deb': ['python3-jmespath', 'python3-xmltodict', 's3cmd'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-jmespath', 'python3-xmltodict', 's3cmd']}}, 'thrashosds': {'bdev_inject_crash': 2, 'bdev_inject_crash_probability': 0.5}, 'workunit': {'branch': 'tt-tentacle', 'sha1': '0392f78529848ec72469e8e431875cb98d3a5fb4'}}, 'owner': 'kyr', 'priority': 1000, 'repo': 'https://github.com/ceph/ceph.git', 'roles': [['mon.a', 'mgr.x', 'osd.0', 'osd.1', 'osd.2', 'client.0']], 'seed': 3051, 'sha1': '70f8415b300f041766fa27faf7d5472699e32388', 'sleep_before_teardown': 0, 'subset': '1/128', 'suite': 'rbd', 'suite_branch': 'tt-tentacle', 'suite_path': '/home/teuthos/src/github.com_kshtsk_ceph_0392f78529848ec72469e8e431875cb98d3a5fb4/qa', 'suite_relpath': 'qa', 'suite_repo': 'https://github.com/kshtsk/ceph.git', 'suite_sha1': '0392f78529848ec72469e8e431875cb98d3a5fb4', 'targets': {'vm01.local': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAEZ/pkw+OjvPpGlQZVoY9oPgppKVoxKt6C8jtlR5FHneXM4CmLpK6FmagGLmuIiAgExmNNElexPKds5DRmE6Yg='}, 'tasks': [{'internal.check_packages': None}, {'internal.buildpackages_prep': None}, {'internal.save_config': None}, 
{'internal.check_lock': None}, {'internal.add_remotes': None}, {'console_log': None}, {'internal.connect': None}, {'internal.push_inventory': None}, {'internal.serialize_remote_roles': None}, {'internal.check_conflict': None}, {'internal.check_ceph_data': None}, {'internal.vm_setup': None}, {'internal.base': None}, {'internal.archive_upload': None}, {'internal.archive': None}, {'internal.coredump': None}, {'internal.sudo': None}, {'internal.syslog': None}, {'internal.timer': None}, {'pcp': None}, {'selinux': None}, {'ansible.cephlab': None}, {'clock': None}, {'install': None}, {'ceph': None}, {'workunit': {'clients': {'client.0': ['rbd/cli_generic.sh']}}}], 'teuthology': {'fragments_dropped': [], 'meta': {}, 'postmerge': []}, 'teuthology_branch': 'clyso-debian-13', 'teuthology_repo': 'https://github.com/clyso/teuthology', 'teuthology_sha1': '1c580df7a9c7c2aadc272da296344fd99f27c444', 'timestamp': '2026-03-20_22:04:26', 'tube': 'vps', 'user': 'kyr', 'verbose': False, 'worker_log': '/home/teuthos/.teuthology/dispatcher/dispatcher.vps.2366871'} 2026-03-24T16:52:44.770 DEBUG:teuthology.orchestra.run.vm01:> install -d -m0755 -- /home/ubuntu/cephtest/ceph.data 2026-03-24T16:52:44.813 DEBUG:teuthology.orchestra.run.vm01:> sudo install -d -m0777 -- /var/run/ceph 2026-03-24T16:52:44.862 DEBUG:teuthology.orchestra.run.vm01:> set -ex 2026-03-24T16:52:44.862 DEBUG:teuthology.orchestra.run.vm01:> dd if=/scratch_devs of=/dev/stdout 2026-03-24T16:52:44.909 DEBUG:teuthology.misc:devs=['/dev/vg_nvme/lv_1', '/dev/vg_nvme/lv_2', '/dev/vg_nvme/lv_3', '/dev/vg_nvme/lv_4'] 2026-03-24T16:52:44.909 DEBUG:teuthology.orchestra.run.vm01:> stat /dev/vg_nvme/lv_1 2026-03-24T16:52:44.952 INFO:teuthology.orchestra.run.vm01.stdout: File: /dev/vg_nvme/lv_1 -> ../dm-0 2026-03-24T16:52:44.952 INFO:teuthology.orchestra.run.vm01.stdout: Size: 7 Blocks: 0 IO Block: 4096 symbolic link 2026-03-24T16:52:44.952 INFO:teuthology.orchestra.run.vm01.stdout:Device: 5h/5d Inode: 792 Links: 1 
2026-03-24T16:52:44.952 INFO:teuthology.orchestra.run.vm01.stdout:Access: (0777/lrwxrwxrwx) Uid: ( 0/ root) Gid: ( 0/ root) 2026-03-24T16:52:44.952 INFO:teuthology.orchestra.run.vm01.stdout:Access: 2026-03-24 16:51:35.559366000 +0000 2026-03-24T16:52:44.952 INFO:teuthology.orchestra.run.vm01.stdout:Modify: 2026-03-24 16:51:35.419366000 +0000 2026-03-24T16:52:44.952 INFO:teuthology.orchestra.run.vm01.stdout:Change: 2026-03-24 16:51:35.419366000 +0000 2026-03-24T16:52:44.952 INFO:teuthology.orchestra.run.vm01.stdout: Birth: - 2026-03-24T16:52:44.952 DEBUG:teuthology.orchestra.run.vm01:> sudo dd if=/dev/vg_nvme/lv_1 of=/dev/null count=1 2026-03-24T16:52:45.002 INFO:teuthology.orchestra.run.vm01.stderr:1+0 records in 2026-03-24T16:52:45.002 INFO:teuthology.orchestra.run.vm01.stderr:1+0 records out 2026-03-24T16:52:45.002 INFO:teuthology.orchestra.run.vm01.stderr:512 bytes copied, 0.000143499 s, 3.6 MB/s 2026-03-24T16:52:45.002 DEBUG:teuthology.orchestra.run.vm01:> ! mount | grep -v devtmpfs | grep -q /dev/vg_nvme/lv_1 2026-03-24T16:52:45.054 DEBUG:teuthology.orchestra.run.vm01:> stat /dev/vg_nvme/lv_2 2026-03-24T16:52:45.101 INFO:teuthology.orchestra.run.vm01.stdout: File: /dev/vg_nvme/lv_2 -> ../dm-1 2026-03-24T16:52:45.101 INFO:teuthology.orchestra.run.vm01.stdout: Size: 7 Blocks: 0 IO Block: 4096 symbolic link 2026-03-24T16:52:45.101 INFO:teuthology.orchestra.run.vm01.stdout:Device: 5h/5d Inode: 825 Links: 1 2026-03-24T16:52:45.101 INFO:teuthology.orchestra.run.vm01.stdout:Access: (0777/lrwxrwxrwx) Uid: ( 0/ root) Gid: ( 0/ root) 2026-03-24T16:52:45.101 INFO:teuthology.orchestra.run.vm01.stdout:Access: 2026-03-24 16:51:35.735366000 +0000 2026-03-24T16:52:45.101 INFO:teuthology.orchestra.run.vm01.stdout:Modify: 2026-03-24 16:51:35.731366000 +0000 2026-03-24T16:52:45.101 INFO:teuthology.orchestra.run.vm01.stdout:Change: 2026-03-24 16:51:35.731366000 +0000 2026-03-24T16:52:45.101 INFO:teuthology.orchestra.run.vm01.stdout: Birth: - 2026-03-24T16:52:45.101 
DEBUG:teuthology.orchestra.run.vm01:> sudo dd if=/dev/vg_nvme/lv_2 of=/dev/null count=1 2026-03-24T16:52:45.150 INFO:teuthology.orchestra.run.vm01.stderr:1+0 records in 2026-03-24T16:52:45.150 INFO:teuthology.orchestra.run.vm01.stderr:1+0 records out 2026-03-24T16:52:45.150 INFO:teuthology.orchestra.run.vm01.stderr:512 bytes copied, 0.000138449 s, 3.7 MB/s 2026-03-24T16:52:45.151 DEBUG:teuthology.orchestra.run.vm01:> ! mount | grep -v devtmpfs | grep -q /dev/vg_nvme/lv_2 2026-03-24T16:52:45.198 DEBUG:teuthology.orchestra.run.vm01:> stat /dev/vg_nvme/lv_3 2026-03-24T16:52:45.245 INFO:teuthology.orchestra.run.vm01.stdout: File: /dev/vg_nvme/lv_3 -> ../dm-2 2026-03-24T16:52:45.245 INFO:teuthology.orchestra.run.vm01.stdout: Size: 7 Blocks: 0 IO Block: 4096 symbolic link 2026-03-24T16:52:45.245 INFO:teuthology.orchestra.run.vm01.stdout:Device: 5h/5d Inode: 855 Links: 1 2026-03-24T16:52:45.245 INFO:teuthology.orchestra.run.vm01.stdout:Access: (0777/lrwxrwxrwx) Uid: ( 0/ root) Gid: ( 0/ root) 2026-03-24T16:52:45.245 INFO:teuthology.orchestra.run.vm01.stdout:Access: 2026-03-24 16:51:36.155366000 +0000 2026-03-24T16:52:45.245 INFO:teuthology.orchestra.run.vm01.stdout:Modify: 2026-03-24 16:51:36.019366000 +0000 2026-03-24T16:52:45.245 INFO:teuthology.orchestra.run.vm01.stdout:Change: 2026-03-24 16:51:36.019366000 +0000 2026-03-24T16:52:45.245 INFO:teuthology.orchestra.run.vm01.stdout: Birth: - 2026-03-24T16:52:45.245 DEBUG:teuthology.orchestra.run.vm01:> sudo dd if=/dev/vg_nvme/lv_3 of=/dev/null count=1 2026-03-24T16:52:45.297 INFO:teuthology.orchestra.run.vm01.stderr:1+0 records in 2026-03-24T16:52:45.297 INFO:teuthology.orchestra.run.vm01.stderr:1+0 records out 2026-03-24T16:52:45.297 INFO:teuthology.orchestra.run.vm01.stderr:512 bytes copied, 0.000161011 s, 3.2 MB/s 2026-03-24T16:52:45.298 DEBUG:teuthology.orchestra.run.vm01:> ! 
mount | grep -v devtmpfs | grep -q /dev/vg_nvme/lv_3 2026-03-24T16:52:45.346 DEBUG:teuthology.orchestra.run.vm01:> stat /dev/vg_nvme/lv_4 2026-03-24T16:52:45.393 INFO:teuthology.orchestra.run.vm01.stdout: File: /dev/vg_nvme/lv_4 -> ../dm-3 2026-03-24T16:52:45.393 INFO:teuthology.orchestra.run.vm01.stdout: Size: 7 Blocks: 0 IO Block: 4096 symbolic link 2026-03-24T16:52:45.393 INFO:teuthology.orchestra.run.vm01.stdout:Device: 5h/5d Inode: 887 Links: 1 2026-03-24T16:52:45.393 INFO:teuthology.orchestra.run.vm01.stdout:Access: (0777/lrwxrwxrwx) Uid: ( 0/ root) Gid: ( 0/ root) 2026-03-24T16:52:45.393 INFO:teuthology.orchestra.run.vm01.stdout:Access: 2026-03-24 16:51:40.983366000 +0000 2026-03-24T16:52:45.393 INFO:teuthology.orchestra.run.vm01.stdout:Modify: 2026-03-24 16:51:36.335366000 +0000 2026-03-24T16:52:45.393 INFO:teuthology.orchestra.run.vm01.stdout:Change: 2026-03-24 16:51:36.335366000 +0000 2026-03-24T16:52:45.393 INFO:teuthology.orchestra.run.vm01.stdout: Birth: - 2026-03-24T16:52:45.393 DEBUG:teuthology.orchestra.run.vm01:> sudo dd if=/dev/vg_nvme/lv_4 of=/dev/null count=1 2026-03-24T16:52:45.441 INFO:teuthology.orchestra.run.vm01.stderr:1+0 records in 2026-03-24T16:52:45.441 INFO:teuthology.orchestra.run.vm01.stderr:1+0 records out 2026-03-24T16:52:45.441 INFO:teuthology.orchestra.run.vm01.stderr:512 bytes copied, 0.000141355 s, 3.6 MB/s 2026-03-24T16:52:45.442 DEBUG:teuthology.orchestra.run.vm01:> ! mount | grep -v devtmpfs | grep -q /dev/vg_nvme/lv_4 2026-03-24T16:52:45.490 INFO:tasks.ceph:osd dev map: {'osd.0': '/dev/vg_nvme/lv_1', 'osd.1': '/dev/vg_nvme/lv_2', 'osd.2': '/dev/vg_nvme/lv_3'} 2026-03-24T16:52:45.490 INFO:tasks.ceph:remote_to_roles_to_devs: {Remote(name='ubuntu@vm01.local'): {'osd.0': '/dev/vg_nvme/lv_1', 'osd.1': '/dev/vg_nvme/lv_2', 'osd.2': '/dev/vg_nvme/lv_3'}} 2026-03-24T16:52:45.490 INFO:tasks.ceph:Generating config... 
2026-03-24T16:52:45.490 INFO:tasks.ceph:[client] rbd default format = 1 2026-03-24T16:52:45.490 INFO:tasks.ceph:[global] mon client directed command retry = 5 2026-03-24T16:52:45.490 INFO:tasks.ceph:[global] mon warn on pool no app = False 2026-03-24T16:52:45.490 INFO:tasks.ceph:[global] ms inject socket failures = 5000 2026-03-24T16:52:45.490 INFO:tasks.ceph:[mgr] debug mgr = 20 2026-03-24T16:52:45.490 INFO:tasks.ceph:[mgr] debug ms = 1 2026-03-24T16:52:45.490 INFO:tasks.ceph:[mon] debug mon = 20 2026-03-24T16:52:45.490 INFO:tasks.ceph:[mon] debug ms = 1 2026-03-24T16:52:45.490 INFO:tasks.ceph:[mon] debug paxos = 20 2026-03-24T16:52:45.490 INFO:tasks.ceph:[osd] bluestore block size = 96636764160 2026-03-24T16:52:45.491 INFO:tasks.ceph:[osd] bluestore compression algorithm = zlib 2026-03-24T16:52:45.491 INFO:tasks.ceph:[osd] bluestore compression mode = aggressive 2026-03-24T16:52:45.491 INFO:tasks.ceph:[osd] bluestore fsck on mount = True 2026-03-24T16:52:45.491 INFO:tasks.ceph:[osd] debug bluefs = 1/20 2026-03-24T16:52:45.491 INFO:tasks.ceph:[osd] debug bluestore = 1/20 2026-03-24T16:52:45.491 INFO:tasks.ceph:[osd] debug ms = 1 2026-03-24T16:52:45.491 INFO:tasks.ceph:[osd] debug osd = 20 2026-03-24T16:52:45.491 INFO:tasks.ceph:[osd] debug rocksdb = 4/10 2026-03-24T16:52:45.491 INFO:tasks.ceph:[osd] mon osd backfillfull_ratio = 0.85 2026-03-24T16:52:45.491 INFO:tasks.ceph:[osd] mon osd full ratio = 0.9 2026-03-24T16:52:45.491 INFO:tasks.ceph:[osd] mon osd nearfull ratio = 0.8 2026-03-24T16:52:45.491 INFO:tasks.ceph:[osd] osd failsafe full ratio = 0.95 2026-03-24T16:52:45.491 INFO:tasks.ceph:[osd] osd mclock iops capacity threshold hdd = 49000 2026-03-24T16:52:45.491 INFO:tasks.ceph:[osd] osd objectstore = bluestore 2026-03-24T16:52:45.491 INFO:tasks.ceph:[osd] osd shutdown pgref assert = True 2026-03-24T16:52:45.491 INFO:tasks.ceph:Setting up mon.a... 
2026-03-24T16:52:45.491 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool --create-keyring /etc/ceph/ceph.keyring 2026-03-24T16:52:45.554 INFO:teuthology.orchestra.run.vm01.stdout:creating /etc/ceph/ceph.keyring 2026-03-24T16:52:45.556 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool --gen-key --name=mon. /etc/ceph/ceph.keyring 2026-03-24T16:52:45.621 DEBUG:teuthology.orchestra.run.vm01:> sudo chmod 0644 /etc/ceph/ceph.keyring 2026-03-24T16:52:45.670 DEBUG:tasks.ceph:Ceph mon addresses: [('mon.a', '192.168.123.101')] 2026-03-24T16:52:45.670 DEBUG:tasks.ceph:writing out conf {'global': {'chdir': '', 'pid file': '/var/run/ceph/$cluster-$name.pid', 'auth supported': 'cephx', 'filestore xattr use omap': 'true', 'mon clock drift allowed': '1.000', 'osd crush chooseleaf type': '0', 'auth debug': 'true', 'ms die on old message': 'true', 'ms die on bug': 'true', 'mon max pg per osd': '10000', 'mon pg warn max object skew': '0', 'osd_pool_default_pg_autoscale_mode': 'off', 'osd pool default size': '2', 'mon osd allow primary affinity': 'true', 'mon osd allow pg remap': 'true', 'mon warn on legacy crush tunables': 'false', 'mon warn on crush straw calc version zero': 'false', 'mon warn on no sortbitwise': 'false', 'mon warn on osd down out interval zero': 'false', 'mon warn on too few osds': 'false', 'mon_warn_on_pool_pg_num_not_power_of_two': 'false', 'mon_warn_on_pool_no_redundancy': 'false', 'mon_allow_pool_size_one': 'true', 'osd pool default erasure code profile': 'plugin=isa technique=reed_sol_van k=2 m=1 crush-failure-domain=osd', 'osd default data pool replay window': '5', 'mon allow pool delete': 'true', 'mon cluster log file level': 'debug', 'debug asserts on shutdown': 'true', 'mon health detail to clog': 'false', 'mon host': '192.168.123.101', 'mon client directed command retry': 5, 'mon warn on pool no app': 
False, 'ms inject socket failures': 5000}, 'osd': {'osd journal size': '100', 'osd scrub load threshold': '5.0', 'osd scrub max interval': '600', 'osd mclock profile': 'high_recovery_ops', 'osd mclock skip benchmark': 'true', 'osd recover clone overlap': 'true', 'osd recovery max chunk': '1048576', 'osd debug shutdown': 'true', 'osd debug op order': 'true', 'osd debug verify stray on activate': 'true', 'osd debug trim objects': 'true', 'osd open classes on start': 'true', 'osd debug pg log writeout': 'true', 'osd deep scrub update digest min age': '30', 'osd map max advance': '10', 'journal zero on create': 'true', 'filestore ondisk finisher threads': '3', 'filestore apply finisher threads': '3', 'bdev debug aio': 'true', 'osd debug misdirected ops': 'true', 'bluestore block size': 96636764160, 'bluestore compression algorithm': 'zlib', 'bluestore compression mode': 'aggressive', 'bluestore fsck on mount': True, 'debug bluefs': '1/20', 'debug bluestore': '1/20', 'debug ms': 1, 'debug osd': 20, 'debug rocksdb': '4/10', 'mon osd backfillfull_ratio': 0.85, 'mon osd full ratio': 0.9, 'mon osd nearfull ratio': 0.8, 'osd failsafe full ratio': 0.95, 'osd mclock iops capacity threshold hdd': 49000, 'osd objectstore': 'bluestore', 'osd shutdown pgref assert': True}, 'mgr': {'debug ms': 1, 'debug mgr': 20, 'debug mon': '20', 'debug auth': '20', 'mon reweight min pgs per osd': '4', 'mon reweight min bytes per osd': '10', 'mgr/telemetry/nag': 'false'}, 'mon': {'debug ms': 1, 'debug mon': 20, 'debug paxos': 20, 'debug auth': '20', 'mon data avail warn': '5', 'mon mgr mkfs grace': '240', 'mon reweight min pgs per osd': '4', 'mon osd reporter subtree level': 'osd', 'mon osd prime pg temp': 'true', 'mon reweight min bytes per osd': '10', 'auth mon ticket ttl': '660', 'auth service ticket ttl': '240', 'mon_warn_on_insecure_global_id_reclaim': 'false', 'mon_warn_on_insecure_global_id_reclaim_allowed': 'false', 'mon_down_mkfs_grace': '2m', 'mon_warn_on_filestore_osds': 'false'}, 
'client': {'rgw cache enabled': 'true', 'rgw enable ops log': 'true', 'rgw enable usage log': 'true', 'log file': '/var/log/ceph/$cluster-$name.$pid.log', 'admin socket': '/var/run/ceph/$cluster-$name.$pid.asok', 'rbd default format': 1}, 'mon.a': {}} 2026-03-24T16:52:45.671 DEBUG:teuthology.orchestra.run.vm01:> set -ex 2026-03-24T16:52:45.671 DEBUG:teuthology.orchestra.run.vm01:> dd of=/home/ubuntu/cephtest/ceph.tmp.conf 2026-03-24T16:52:45.717 DEBUG:teuthology.orchestra.run.vm01:> adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage monmaptool -c /home/ubuntu/cephtest/ceph.tmp.conf --create --clobber --enable-all-features --add a 192.168.123.101 --print /home/ubuntu/cephtest/ceph.monmap 2026-03-24T16:52:45.776 INFO:teuthology.orchestra.run.vm01.stdout:monmaptool: monmap file /home/ubuntu/cephtest/ceph.monmap 2026-03-24T16:52:45.776 INFO:teuthology.orchestra.run.vm01.stdout:monmaptool: generated fsid dc403d64-7ddd-4e06-90c8-8f9c41489fa2 2026-03-24T16:52:45.776 INFO:teuthology.orchestra.run.vm01.stdout:setting min_mon_release = tentacle 2026-03-24T16:52:45.777 INFO:teuthology.orchestra.run.vm01.stdout:epoch 0 2026-03-24T16:52:45.777 INFO:teuthology.orchestra.run.vm01.stdout:fsid dc403d64-7ddd-4e06-90c8-8f9c41489fa2 2026-03-24T16:52:45.777 INFO:teuthology.orchestra.run.vm01.stdout:last_changed 2026-03-24T16:52:45.776247+0000 2026-03-24T16:52:45.777 INFO:teuthology.orchestra.run.vm01.stdout:created 2026-03-24T16:52:45.776247+0000 2026-03-24T16:52:45.777 INFO:teuthology.orchestra.run.vm01.stdout:min_mon_release 20 (tentacle) 2026-03-24T16:52:45.777 INFO:teuthology.orchestra.run.vm01.stdout:election_strategy: 1 2026-03-24T16:52:45.777 INFO:teuthology.orchestra.run.vm01.stdout:0: [v2:192.168.123.101:3300/0,v1:192.168.123.101:6789/0] mon.a 2026-03-24T16:52:45.777 INFO:teuthology.orchestra.run.vm01.stdout:monmaptool: writing epoch 0 to /home/ubuntu/cephtest/ceph.monmap (1 monitors) 2026-03-24T16:52:45.778 DEBUG:teuthology.orchestra.run.vm01:> rm -- 
/home/ubuntu/cephtest/ceph.tmp.conf 2026-03-24T16:52:45.825 INFO:tasks.ceph:Writing /etc/ceph/ceph.conf for FSID dc403d64-7ddd-4e06-90c8-8f9c41489fa2... 2026-03-24T16:52:45.826 DEBUG:teuthology.orchestra.run.vm01:> sudo mkdir -p /etc/ceph && sudo chmod 0755 /etc/ceph && sudo tee /etc/ceph/ceph.conf && sudo chmod 0644 /etc/ceph/ceph.conf > /dev/null 2026-03-24T16:52:45.884 INFO:teuthology.orchestra.run.vm01.stdout:[global] 2026-03-24T16:52:45.884 INFO:teuthology.orchestra.run.vm01.stdout: chdir = "" 2026-03-24T16:52:45.884 INFO:teuthology.orchestra.run.vm01.stdout: pid file = /var/run/ceph/$cluster-$name.pid 2026-03-24T16:52:45.884 INFO:teuthology.orchestra.run.vm01.stdout: auth supported = cephx 2026-03-24T16:52:45.884 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:45.884 INFO:teuthology.orchestra.run.vm01.stdout: filestore xattr use omap = true 2026-03-24T16:52:45.884 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:45.884 INFO:teuthology.orchestra.run.vm01.stdout: mon clock drift allowed = 1.000 2026-03-24T16:52:45.884 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:45.884 INFO:teuthology.orchestra.run.vm01.stdout: osd crush chooseleaf type = 0 2026-03-24T16:52:45.884 INFO:teuthology.orchestra.run.vm01.stdout: auth debug = true 2026-03-24T16:52:45.884 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:45.884 INFO:teuthology.orchestra.run.vm01.stdout: ms die on old message = true 2026-03-24T16:52:45.884 INFO:teuthology.orchestra.run.vm01.stdout: ms die on bug = true 2026-03-24T16:52:45.884 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:45.884 INFO:teuthology.orchestra.run.vm01.stdout: mon max pg per osd = 10000 # >= luminous 2026-03-24T16:52:45.884 INFO:teuthology.orchestra.run.vm01.stdout: mon pg warn max object skew = 0 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: # disable pg_autoscaler by default for new pools 
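The conf dict logged above ("writing out conf {...}") is rendered into the `/etc/ceph/ceph.conf` INI text that follows, one section per daemon type. A minimal sketch of that serialization, under the assumption that values are written as-is (the helper name `render_ceph_conf` is ours, not teuthology's):

```python
# Sketch: serialize a teuthology-style conf dict into ceph.conf INI text.
# render_ceph_conf is a hypothetical helper name. Values are interpolated
# unmodified, which is why unquoted Python booleans from the job overrides
# show up literally as "True"/"False" in the written conf.
def render_ceph_conf(conf: dict) -> str:
    lines = []
    for section, options in conf.items():
        lines.append(f"[{section}]")
        for key, value in options.items():
            lines.append(f"\t{key} = {value}")
    return "\n".join(lines) + "\n"

conf = {
    "global": {"auth supported": "cephx",
               "osd pool default size": "2",
               "mon warn on pool no app": False},
    "osd": {"osd objectstore": "bluestore",
            "bluestore fsck on mount": True},
    "mon.a": {},
}
text = render_ceph_conf(conf)
print(text)
```

Note how `mon warn on pool no app = False` and `bluestore fsck on mount = True` come out capitalized, matching the conf dump in this log.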
2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: osd_pool_default_pg_autoscale_mode = off 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: osd pool default size = 2 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: mon osd allow primary affinity = true 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: mon osd allow pg remap = true 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: mon warn on legacy crush tunables = false 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: mon warn on crush straw calc version zero = false 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: mon warn on no sortbitwise = false 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: mon warn on osd down out interval zero = false 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: mon warn on too few osds = false 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: mon_warn_on_pool_pg_num_not_power_of_two = false 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: mon_warn_on_pool_no_redundancy = false 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: mon_allow_pool_size_one = true 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: osd pool default erasure code profile = plugin=isa technique=reed_sol_van k=2 m=1 crush-failure-domain=osd 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: osd default data pool replay window = 5 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: mon allow pool delete = true 
2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: mon cluster log file level = debug 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: debug asserts on shutdown = true 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: mon health detail to clog = false 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: mon host = 192.168.123.101 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: mon client directed command retry = 5 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: mon warn on pool no app = False 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: ms inject socket failures = 5000 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: fsid = dc403d64-7ddd-4e06-90c8-8f9c41489fa2 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout:[osd] 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: osd journal size = 100 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: osd scrub load threshold = 5.0 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: osd scrub max interval = 600 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: osd mclock profile = high_recovery_ops 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: osd mclock skip benchmark = true 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: osd recover clone overlap = true 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: osd recovery max chunk = 1048576 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: osd debug shutdown = 
true 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: osd debug op order = true 2026-03-24T16:52:45.885 INFO:teuthology.orchestra.run.vm01.stdout: osd debug verify stray on activate = true 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: osd debug trim objects = true 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: osd open classes on start = true 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: osd debug pg log writeout = true 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: osd deep scrub update digest min age = 30 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: osd map max advance = 10 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: journal zero on create = true 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: filestore ondisk finisher threads = 3 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: filestore apply finisher threads = 3 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: bdev debug aio = true 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: osd debug misdirected ops = true 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: bluestore block size = 96636764160 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: bluestore compression algorithm = zlib 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: bluestore compression mode = aggressive 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: bluestore fsck 
on mount = True 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: debug bluefs = 1/20 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: debug bluestore = 1/20 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: debug ms = 1 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: debug osd = 20 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: debug rocksdb = 4/10 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: mon osd backfillfull_ratio = 0.85 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: mon osd full ratio = 0.9 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: mon osd nearfull ratio = 0.8 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: osd failsafe full ratio = 0.95 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: osd mclock iops capacity threshold hdd = 49000 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: osd objectstore = bluestore 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: osd shutdown pgref assert = True 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout:[mgr] 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: debug ms = 1 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: debug mgr = 20 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: debug mon = 20 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: debug auth = 20 2026-03-24T16:52:45.886 INFO:teuthology.orchestra.run.vm01.stdout: mon reweight min pgs per osd = 4 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout: mon reweight min bytes per osd = 10 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout: mgr/telemetry/nag = false 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:45.887 
INFO:teuthology.orchestra.run.vm01.stdout:[mon] 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout: debug ms = 1 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout: debug mon = 20 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout: debug paxos = 20 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout: debug auth = 20 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout: mon data avail warn = 5 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout: mon mgr mkfs grace = 240 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout: mon reweight min pgs per osd = 4 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout: mon osd reporter subtree level = osd 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout: mon osd prime pg temp = true 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout: mon reweight min bytes per osd = 10 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout: # rotate auth tickets quickly to exercise renewal paths 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout: auth mon ticket ttl = 660 # 11m 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout: auth service ticket ttl = 240 # 4m 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout: # don't complain about insecure global_id in the test suite 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout: mon_warn_on_insecure_global_id_reclaim = false 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout: mon_warn_on_insecure_global_id_reclaim_allowed = false 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout: # 1m isn't quite enough 2026-03-24T16:52:45.887 
INFO:teuthology.orchestra.run.vm01.stdout: mon_down_mkfs_grace = 2m 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout: mon_warn_on_filestore_osds = false 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout:[client] 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout: rgw cache enabled = true 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout: rgw enable ops log = true 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout: rgw enable usage log = true 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout: log file = /var/log/ceph/$cluster-$name.$pid.log 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout: admin socket = /var/run/ceph/$cluster-$name.$pid.asok 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout: rbd default format = 1 2026-03-24T16:52:45.887 INFO:teuthology.orchestra.run.vm01.stdout:[mon.a] 2026-03-24T16:52:45.890 INFO:tasks.ceph:Creating admin key on mon.a... 2026-03-24T16:52:45.890 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool --gen-key --name=client.admin --cap mon 'allow *' --cap osd 'allow *' --cap mds 'allow *' --cap mgr 'allow *' /etc/ceph/ceph.keyring 2026-03-24T16:52:45.961 INFO:tasks.ceph:Copying monmap to all nodes... 
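The "Creating admin key" step above grants `client.admin` full caps on every subsystem. A sketch of how that `ceph-authtool` argv is assembled (`build_admin_key_cmd` is a hypothetical name; the real command is additionally wrapped in `sudo adjust-ulimits ceph-coverage`):

```python
# Sketch: assemble the ceph-authtool argv used for the client.admin key,
# mirroring the command in the log. build_admin_key_cmd is hypothetical.
def build_admin_key_cmd(keyring="/etc/ceph/ceph.keyring"):
    cmd = ["ceph-authtool", "--gen-key", "--name=client.admin"]
    for daemon in ("mon", "osd", "mds", "mgr"):
        cmd += ["--cap", daemon, "allow *"]  # full caps on each subsystem
    cmd.append(keyring)
    return cmd

print(build_admin_key_cmd())
```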
2026-03-24T16:52:45.962 DEBUG:teuthology.orchestra.run.vm01:> set -ex 2026-03-24T16:52:45.962 DEBUG:teuthology.orchestra.run.vm01:> dd if=/etc/ceph/ceph.keyring of=/dev/stdout 2026-03-24T16:52:46.006 DEBUG:teuthology.orchestra.run.vm01:> set -ex 2026-03-24T16:52:46.006 DEBUG:teuthology.orchestra.run.vm01:> dd if=/home/ubuntu/cephtest/ceph.monmap of=/dev/stdout 2026-03-24T16:52:46.053 INFO:tasks.ceph:Sending monmap to node ubuntu@vm01.local 2026-03-24T16:52:46.053 DEBUG:teuthology.orchestra.run.vm01:> set -ex 2026-03-24T16:52:46.053 DEBUG:teuthology.orchestra.run.vm01:> sudo dd of=/etc/ceph/ceph.keyring 2026-03-24T16:52:46.053 DEBUG:teuthology.orchestra.run.vm01:> sudo chmod 0644 /etc/ceph/ceph.keyring 2026-03-24T16:52:46.108 DEBUG:teuthology.orchestra.run.vm01:> set -ex 2026-03-24T16:52:46.108 DEBUG:teuthology.orchestra.run.vm01:> dd of=/home/ubuntu/cephtest/ceph.monmap 2026-03-24T16:52:46.153 INFO:tasks.ceph:Setting up mon nodes... 2026-03-24T16:52:46.153 INFO:tasks.ceph:Setting up mgr nodes... 2026-03-24T16:52:46.153 DEBUG:teuthology.orchestra.run.vm01:> sudo mkdir -p /var/lib/ceph/mgr/ceph-x && sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool --create-keyring --gen-key --name=mgr.x /var/lib/ceph/mgr/ceph-x/keyring 2026-03-24T16:52:46.222 INFO:teuthology.orchestra.run.vm01.stdout:creating /var/lib/ceph/mgr/ceph-x/keyring 2026-03-24T16:52:46.225 INFO:tasks.ceph:Setting up mds nodes... 2026-03-24T16:52:46.225 INFO:tasks.ceph_client:Setting up client nodes... 2026-03-24T16:52:46.225 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool --create-keyring --gen-key --name=client.0 /etc/ceph/ceph.client.0.keyring && sudo chmod 0644 /etc/ceph/ceph.client.0.keyring 2026-03-24T16:52:46.288 INFO:teuthology.orchestra.run.vm01.stdout:creating /etc/ceph/ceph.client.0.keyring 2026-03-24T16:52:46.296 INFO:tasks.ceph:Running mkfs on osd nodes... 
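The monmap/keyring copy steps above all follow one pattern: small files are moved over the SSH channel by piping through `dd`, reading with `dd if=PATH of=/dev/stdout` and writing with `dd of=PATH` fed on stdin. A sketch of that pattern (both helper names are ours):

```python
# Sketch: teuthology's dd-based file transfer, as seen in the log.
# read_cmd/write_cmd are hypothetical names illustrating the pattern.
def read_cmd(path, sudo=False):
    prefix = ["sudo"] if sudo else []
    return prefix + ["dd", f"if={path}", "of=/dev/stdout"]

def write_cmd(path, sudo=False):
    # the file content arrives on the remote process's stdin
    prefix = ["sudo"] if sudo else []
    return prefix + ["dd", f"of={path}"]

print(read_cmd("/home/ubuntu/cephtest/ceph.monmap"))
print(write_cmd("/etc/ceph/ceph.keyring", sudo=True))
```

Using `dd` rather than scp keeps everything on the one already-open SSH session and works for root-owned targets via `sudo`.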
2026-03-24T16:52:46.297 INFO:tasks.ceph:ctx.disk_config.remote_to_roles_to_dev: {Remote(name='ubuntu@vm01.local'): {'osd.0': '/dev/vg_nvme/lv_1', 'osd.1': '/dev/vg_nvme/lv_2', 'osd.2': '/dev/vg_nvme/lv_3'}} 2026-03-24T16:52:46.297 DEBUG:teuthology.orchestra.run.vm01:> sudo mkdir -p /var/lib/ceph/osd/ceph-0 2026-03-24T16:52:46.347 INFO:tasks.ceph:roles_to_devs: {'osd.0': '/dev/vg_nvme/lv_1', 'osd.1': '/dev/vg_nvme/lv_2', 'osd.2': '/dev/vg_nvme/lv_3'} 2026-03-24T16:52:46.347 INFO:tasks.ceph:role: osd.0 2026-03-24T16:52:46.347 INFO:tasks.ceph:['mkfs.xfs', '-f', '-i', 'size=2048'] on /dev/vg_nvme/lv_1 on ubuntu@vm01.local 2026-03-24T16:52:46.347 DEBUG:teuthology.orchestra.run.vm01:> yes | sudo mkfs.xfs -f -i size=2048 /dev/vg_nvme/lv_1 2026-03-24T16:52:46.399 INFO:teuthology.orchestra.run.vm01.stdout:meta-data=/dev/vg_nvme/lv_1 isize=2048 agcount=4, agsize=1310464 blks 2026-03-24T16:52:46.399 INFO:teuthology.orchestra.run.vm01.stdout: = sectsz=512 attr=2, projid32bit=1 2026-03-24T16:52:46.399 INFO:teuthology.orchestra.run.vm01.stdout: = crc=1 finobt=1, sparse=1, rmapbt=0 2026-03-24T16:52:46.399 INFO:teuthology.orchestra.run.vm01.stdout: = reflink=1 bigtime=0 inobtcount=0 2026-03-24T16:52:46.399 INFO:teuthology.orchestra.run.vm01.stdout:data = bsize=4096 blocks=5241856, imaxpct=25 2026-03-24T16:52:46.399 INFO:teuthology.orchestra.run.vm01.stdout: = sunit=0 swidth=0 blks 2026-03-24T16:52:46.399 INFO:teuthology.orchestra.run.vm01.stdout:naming =version 2 bsize=4096 ascii-ci=0, ftype=1 2026-03-24T16:52:46.399 INFO:teuthology.orchestra.run.vm01.stdout:log =internal log bsize=4096 blocks=2560, version=2 2026-03-24T16:52:46.400 INFO:teuthology.orchestra.run.vm01.stdout: = sectsz=512 sunit=0 blks, lazy-count=1 2026-03-24T16:52:46.400 INFO:teuthology.orchestra.run.vm01.stdout:realtime =none extsz=4096 blocks=0, rtextents=0 2026-03-24T16:52:46.406 INFO:teuthology.orchestra.run.vm01.stdout:Discarding blocks...Done. 
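The mkfs phase iterates the `roles_to_devs` mapping shown above: for each OSD it creates the data dir, formats the LV with `mkfs.xfs -f -i size=2048`, and mounts it `noatime` at `/var/lib/ceph/osd/ceph-<id>`. A sketch of that loop (`prep_cmds` is a hypothetical name; the subsequent `restorecon` call, which fails with "command not found" on this Ubuntu node, is evidently tolerated since the run continues):

```python
# Sketch: per-OSD device prep matching the log's mkfs/mount sequence.
# prep_cmds is a hypothetical helper name.
def prep_cmds(roles_to_devs, fs="xfs",
              mkfs_opts=("-f", "-i", "size=2048"), mount_opts="noatime"):
    cmds = []
    for role, dev in roles_to_devs.items():
        osd_id = role.split(".", 1)[1]          # "osd.0" -> "0"
        mnt = f"/var/lib/ceph/osd/ceph-{osd_id}"
        cmds.append(["sudo", "mkdir", "-p", mnt])
        cmds.append(["sudo", f"mkfs.{fs}", *mkfs_opts, dev])
        cmds.append(["sudo", "mount", "-t", fs, "-o", mount_opts, dev, mnt])
    return cmds

cmds = prep_cmds({"osd.0": "/dev/vg_nvme/lv_1",
                  "osd.1": "/dev/vg_nvme/lv_2"})
for c in cmds:
    print(" ".join(c))
```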
2026-03-24T16:52:46.407 INFO:tasks.ceph:mount /dev/vg_nvme/lv_1 on ubuntu@vm01.local -o noatime 2026-03-24T16:52:46.407 DEBUG:teuthology.orchestra.run.vm01:> sudo mount -t xfs -o noatime /dev/vg_nvme/lv_1 /var/lib/ceph/osd/ceph-0 2026-03-24T16:52:46.507 DEBUG:teuthology.orchestra.run.vm01:> sudo /sbin/restorecon /var/lib/ceph/osd/ceph-0 2026-03-24T16:52:46.514 INFO:teuthology.orchestra.run.vm01.stderr:sudo: /sbin/restorecon: command not found 2026-03-24T16:52:46.515 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-24T16:52:46.515 DEBUG:teuthology.orchestra.run.vm01:> sudo mkdir -p /var/lib/ceph/osd/ceph-1 2026-03-24T16:52:46.567 INFO:tasks.ceph:roles_to_devs: {'osd.0': '/dev/vg_nvme/lv_1', 'osd.1': '/dev/vg_nvme/lv_2', 'osd.2': '/dev/vg_nvme/lv_3'} 2026-03-24T16:52:46.567 INFO:tasks.ceph:role: osd.1 2026-03-24T16:52:46.567 INFO:tasks.ceph:['mkfs.xfs', '-f', '-i', 'size=2048'] on /dev/vg_nvme/lv_2 on ubuntu@vm01.local 2026-03-24T16:52:46.567 DEBUG:teuthology.orchestra.run.vm01:> yes | sudo mkfs.xfs -f -i size=2048 /dev/vg_nvme/lv_2 2026-03-24T16:52:46.619 INFO:teuthology.orchestra.run.vm01.stdout:meta-data=/dev/vg_nvme/lv_2 isize=2048 agcount=4, agsize=1310464 blks 2026-03-24T16:52:46.619 INFO:teuthology.orchestra.run.vm01.stdout: = sectsz=512 attr=2, projid32bit=1 2026-03-24T16:52:46.619 INFO:teuthology.orchestra.run.vm01.stdout: = crc=1 finobt=1, sparse=1, rmapbt=0 2026-03-24T16:52:46.619 INFO:teuthology.orchestra.run.vm01.stdout: = reflink=1 bigtime=0 inobtcount=0 2026-03-24T16:52:46.619 INFO:teuthology.orchestra.run.vm01.stdout:data = bsize=4096 blocks=5241856, imaxpct=25 2026-03-24T16:52:46.619 INFO:teuthology.orchestra.run.vm01.stdout: = sunit=0 swidth=0 blks 2026-03-24T16:52:46.619 INFO:teuthology.orchestra.run.vm01.stdout:naming =version 2 bsize=4096 ascii-ci=0, ftype=1 2026-03-24T16:52:46.619 INFO:teuthology.orchestra.run.vm01.stdout:log =internal log bsize=4096 blocks=2560, version=2 2026-03-24T16:52:46.619 
INFO:teuthology.orchestra.run.vm01.stdout: = sectsz=512 sunit=0 blks, lazy-count=1 2026-03-24T16:52:46.619 INFO:teuthology.orchestra.run.vm01.stdout:realtime =none extsz=4096 blocks=0, rtextents=0 2026-03-24T16:52:46.624 INFO:teuthology.orchestra.run.vm01.stdout:Discarding blocks...Done. 2026-03-24T16:52:46.626 INFO:tasks.ceph:mount /dev/vg_nvme/lv_2 on ubuntu@vm01.local -o noatime 2026-03-24T16:52:46.626 DEBUG:teuthology.orchestra.run.vm01:> sudo mount -t xfs -o noatime /dev/vg_nvme/lv_2 /var/lib/ceph/osd/ceph-1 2026-03-24T16:52:46.686 DEBUG:teuthology.orchestra.run.vm01:> sudo /sbin/restorecon /var/lib/ceph/osd/ceph-1 2026-03-24T16:52:46.736 INFO:teuthology.orchestra.run.vm01.stderr:sudo: /sbin/restorecon: command not found 2026-03-24T16:52:46.736 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-24T16:52:46.736 DEBUG:teuthology.orchestra.run.vm01:> sudo mkdir -p /var/lib/ceph/osd/ceph-2 2026-03-24T16:52:46.786 INFO:tasks.ceph:roles_to_devs: {'osd.0': '/dev/vg_nvme/lv_1', 'osd.1': '/dev/vg_nvme/lv_2', 'osd.2': '/dev/vg_nvme/lv_3'} 2026-03-24T16:52:46.786 INFO:tasks.ceph:role: osd.2 2026-03-24T16:52:46.786 INFO:tasks.ceph:['mkfs.xfs', '-f', '-i', 'size=2048'] on /dev/vg_nvme/lv_3 on ubuntu@vm01.local 2026-03-24T16:52:46.786 DEBUG:teuthology.orchestra.run.vm01:> yes | sudo mkfs.xfs -f -i size=2048 /dev/vg_nvme/lv_3 2026-03-24T16:52:46.838 INFO:teuthology.orchestra.run.vm01.stdout:meta-data=/dev/vg_nvme/lv_3 isize=2048 agcount=4, agsize=1310464 blks 2026-03-24T16:52:46.838 INFO:teuthology.orchestra.run.vm01.stdout: = sectsz=512 attr=2, projid32bit=1 2026-03-24T16:52:46.838 INFO:teuthology.orchestra.run.vm01.stdout: = crc=1 finobt=1, sparse=1, rmapbt=0 2026-03-24T16:52:46.838 INFO:teuthology.orchestra.run.vm01.stdout: = reflink=1 bigtime=0 inobtcount=0 2026-03-24T16:52:46.838 INFO:teuthology.orchestra.run.vm01.stdout:data = bsize=4096 blocks=5241856, imaxpct=25 2026-03-24T16:52:46.838 INFO:teuthology.orchestra.run.vm01.stdout: = sunit=0 swidth=0 
blks 2026-03-24T16:52:46.838 INFO:teuthology.orchestra.run.vm01.stdout:naming =version 2 bsize=4096 ascii-ci=0, ftype=1 2026-03-24T16:52:46.838 INFO:teuthology.orchestra.run.vm01.stdout:log =internal log bsize=4096 blocks=2560, version=2 2026-03-24T16:52:46.838 INFO:teuthology.orchestra.run.vm01.stdout: = sectsz=512 sunit=0 blks, lazy-count=1 2026-03-24T16:52:46.839 INFO:teuthology.orchestra.run.vm01.stdout:realtime =none extsz=4096 blocks=0, rtextents=0 2026-03-24T16:52:46.843 INFO:teuthology.orchestra.run.vm01.stdout:Discarding blocks...Done. 2026-03-24T16:52:46.844 INFO:tasks.ceph:mount /dev/vg_nvme/lv_3 on ubuntu@vm01.local -o noatime 2026-03-24T16:52:46.844 DEBUG:teuthology.orchestra.run.vm01:> sudo mount -t xfs -o noatime /dev/vg_nvme/lv_3 /var/lib/ceph/osd/ceph-2 2026-03-24T16:52:46.903 DEBUG:teuthology.orchestra.run.vm01:> sudo /sbin/restorecon /var/lib/ceph/osd/ceph-2 2026-03-24T16:52:46.953 INFO:teuthology.orchestra.run.vm01.stderr:sudo: /sbin/restorecon: command not found 2026-03-24T16:52:46.953 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-24T16:52:46.953 DEBUG:teuthology.orchestra.run.vm01:> sudo MALLOC_CHECK_=3 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-osd --no-mon-config --cluster ceph --mkfs --mkkey -i 0 --monmap /home/ubuntu/cephtest/ceph.monmap 2026-03-24T16:52:47.018 INFO:teuthology.orchestra.run.vm01.stderr:2026-03-24T16:52:47.013+0000 7f9956e27a40 -1 auth: error reading file: /var/lib/ceph/osd/ceph-0/keyring: can't open /var/lib/ceph/osd/ceph-0/keyring: (2) No such file or directory 2026-03-24T16:52:47.018 INFO:teuthology.orchestra.run.vm01.stderr:2026-03-24T16:52:47.017+0000 7f9956e27a40 -1 created new key in keyring /var/lib/ceph/osd/ceph-0/keyring 2026-03-24T16:52:47.018 INFO:teuthology.orchestra.run.vm01.stderr:2026-03-24T16:52:47.017+0000 7f9956e27a40 -1 bdev(0x55efecf21800 /var/lib/ceph/osd/ceph-0/block) open stat got: (1) Operation not permitted 2026-03-24T16:52:47.018 
INFO:teuthology.orchestra.run.vm01.stderr:2026-03-24T16:52:47.017+0000 7f9956e27a40 -1 bluestore(/var/lib/ceph/osd/ceph-0) _read_fsid unparsable uuid 2026-03-24T16:52:48.137 DEBUG:teuthology.orchestra.run.vm01:> sudo chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 2026-03-24T16:52:48.187 DEBUG:teuthology.orchestra.run.vm01:> sudo MALLOC_CHECK_=3 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-osd --no-mon-config --cluster ceph --mkfs --mkkey -i 1 --monmap /home/ubuntu/cephtest/ceph.monmap 2026-03-24T16:52:48.254 INFO:teuthology.orchestra.run.vm01.stderr:2026-03-24T16:52:48.253+0000 7f089815ba40 -1 auth: error reading file: /var/lib/ceph/osd/ceph-1/keyring: can't open /var/lib/ceph/osd/ceph-1/keyring: (2) No such file or directory 2026-03-24T16:52:48.255 INFO:teuthology.orchestra.run.vm01.stderr:2026-03-24T16:52:48.253+0000 7f089815ba40 -1 created new key in keyring /var/lib/ceph/osd/ceph-1/keyring 2026-03-24T16:52:48.255 INFO:teuthology.orchestra.run.vm01.stderr:2026-03-24T16:52:48.253+0000 7f089815ba40 -1 bdev(0x560021985800 /var/lib/ceph/osd/ceph-1/block) open stat got: (1) Operation not permitted 2026-03-24T16:52:48.255 INFO:teuthology.orchestra.run.vm01.stderr:2026-03-24T16:52:48.253+0000 7f089815ba40 -1 bluestore(/var/lib/ceph/osd/ceph-1) _read_fsid unparsable uuid 2026-03-24T16:52:49.310 DEBUG:teuthology.orchestra.run.vm01:> sudo chown -R ceph:ceph /var/lib/ceph/osd/ceph-1 2026-03-24T16:52:49.364 DEBUG:teuthology.orchestra.run.vm01:> sudo MALLOC_CHECK_=3 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-osd --no-mon-config --cluster ceph --mkfs --mkkey -i 2 --monmap /home/ubuntu/cephtest/ceph.monmap 2026-03-24T16:52:49.432 INFO:teuthology.orchestra.run.vm01.stderr:2026-03-24T16:52:49.429+0000 7fc2690e1a40 -1 auth: error reading file: /var/lib/ceph/osd/ceph-2/keyring: can't open /var/lib/ceph/osd/ceph-2/keyring: (2) No such file or directory 2026-03-24T16:52:49.432 
INFO:teuthology.orchestra.run.vm01.stderr:2026-03-24T16:52:49.429+0000 7fc2690e1a40 -1 created new key in keyring /var/lib/ceph/osd/ceph-2/keyring 2026-03-24T16:52:49.432 INFO:teuthology.orchestra.run.vm01.stderr:2026-03-24T16:52:49.429+0000 7fc2690e1a40 -1 bdev(0x56102e5cd800 /var/lib/ceph/osd/ceph-2/block) open stat got: (1) Operation not permitted 2026-03-24T16:52:49.432 INFO:teuthology.orchestra.run.vm01.stderr:2026-03-24T16:52:49.429+0000 7fc2690e1a40 -1 bluestore(/var/lib/ceph/osd/ceph-2) _read_fsid unparsable uuid 2026-03-24T16:52:50.358 DEBUG:teuthology.orchestra.run.vm01:> sudo chown -R ceph:ceph /var/lib/ceph/osd/ceph-2 2026-03-24T16:52:50.410 INFO:tasks.ceph:Reading keys from all nodes... 2026-03-24T16:52:50.410 DEBUG:teuthology.orchestra.run.vm01:> set -ex 2026-03-24T16:52:50.410 DEBUG:teuthology.orchestra.run.vm01:> sudo dd if=/var/lib/ceph/mgr/ceph-x/keyring of=/dev/stdout 2026-03-24T16:52:50.463 DEBUG:teuthology.orchestra.run.vm01:> set -ex 2026-03-24T16:52:50.463 DEBUG:teuthology.orchestra.run.vm01:> sudo dd if=/var/lib/ceph/osd/ceph-0/keyring of=/dev/stdout 2026-03-24T16:52:50.514 DEBUG:teuthology.orchestra.run.vm01:> set -ex 2026-03-24T16:52:50.514 DEBUG:teuthology.orchestra.run.vm01:> sudo dd if=/var/lib/ceph/osd/ceph-1/keyring of=/dev/stdout 2026-03-24T16:52:50.566 DEBUG:teuthology.orchestra.run.vm01:> set -ex 2026-03-24T16:52:50.566 DEBUG:teuthology.orchestra.run.vm01:> sudo dd if=/var/lib/ceph/osd/ceph-2/keyring of=/dev/stdout 2026-03-24T16:52:50.618 DEBUG:teuthology.orchestra.run.vm01:> set -ex 2026-03-24T16:52:50.618 DEBUG:teuthology.orchestra.run.vm01:> dd if=/etc/ceph/ceph.client.0.keyring of=/dev/stdout 2026-03-24T16:52:50.665 INFO:tasks.ceph:Adding keys to all mons... 
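The "Reading keys" / "Adding keys" steps collect each daemon's keyring fragment (via the `dd` reads above) and append the concatenation to the mon keyring in one `sudo tee -a` write. A sketch of the aggregation, with a hypothetical `aggregate_keyrings` helper and made-up key material:

```python
# Sketch: concatenate fetched keyring sections before piping the blob to
# "sudo tee -a /etc/ceph/ceph.keyring". aggregate_keyrings is hypothetical;
# the key strings below are placeholders, not the keys from this run.
def aggregate_keyrings(fragments):
    # each fragment is the text of one [entity] keyring section
    return "".join(f if f.endswith("\n") else f + "\n" for f in fragments)

frags = ["[mgr.x]\n\tkey = AAA==\n", "[osd.0]\n\tkey = BBB=="]
blob = aggregate_keyrings(frags)
print(blob, end="")
```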
2026-03-24T16:52:50.665 DEBUG:teuthology.orchestra.run.vm01:> sudo tee -a /etc/ceph/ceph.keyring 2026-03-24T16:52:50.713 INFO:teuthology.orchestra.run.vm01.stdout:[mgr.x] 2026-03-24T16:52:50.713 INFO:teuthology.orchestra.run.vm01.stdout: key = AQBewcJpi2c6DRAAGmdc0/daLAaEKda5+9OYTQ== 2026-03-24T16:52:50.713 INFO:teuthology.orchestra.run.vm01.stdout:[osd.0] 2026-03-24T16:52:50.713 INFO:teuthology.orchestra.run.vm01.stdout: key = AQBfwcJpmsIMARAAb/2AsU4xQjuxSwoWo/ioeQ== 2026-03-24T16:52:50.713 INFO:teuthology.orchestra.run.vm01.stdout:[osd.1] 2026-03-24T16:52:50.713 INFO:teuthology.orchestra.run.vm01.stdout: key = AQBgwcJp5VQmDxAA60KG51qsQ89Vj3+gOW7nDQ== 2026-03-24T16:52:50.713 INFO:teuthology.orchestra.run.vm01.stdout:[osd.2] 2026-03-24T16:52:50.714 INFO:teuthology.orchestra.run.vm01.stdout: key = AQBhwcJpnXK9GRAAkXLQkv58Ijin4rrw7z4qMg== 2026-03-24T16:52:50.714 INFO:teuthology.orchestra.run.vm01.stdout:[client.0] 2026-03-24T16:52:50.714 INFO:teuthology.orchestra.run.vm01.stdout: key = AQBewcJprr4mERAALynb2j5GEuA96g0v+VSn2Q== 2026-03-24T16:52:50.715 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool /etc/ceph/ceph.keyring --name=mgr.x --cap mon 'allow profile mgr' --cap osd 'allow *' --cap mds 'allow *' 2026-03-24T16:52:50.783 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool /etc/ceph/ceph.keyring --name=osd.0 --cap mon 'allow profile osd' --cap mgr 'allow profile osd' --cap osd 'allow *' 2026-03-24T16:52:50.855 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool /etc/ceph/ceph.keyring --name=osd.1 --cap mon 'allow profile osd' --cap mgr 'allow profile osd' --cap osd 'allow *' 2026-03-24T16:52:50.925 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool /etc/ceph/ceph.keyring 
--name=osd.2 --cap mon 'allow profile osd' --cap mgr 'allow profile osd' --cap osd 'allow *' 2026-03-24T16:52:50.991 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool /etc/ceph/ceph.keyring --name=client.0 --cap mon 'allow rw' --cap mgr 'allow r' --cap osd 'allow rwx' --cap mds allow 2026-03-24T16:52:51.067 INFO:tasks.ceph:Running mkfs on mon nodes... 2026-03-24T16:52:51.067 DEBUG:teuthology.orchestra.run.vm01:> sudo mkdir -p /var/lib/ceph/mon/ceph-a 2026-03-24T16:52:51.119 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-mon --cluster ceph --mkfs -i a --monmap /home/ubuntu/cephtest/ceph.monmap --keyring /etc/ceph/ceph.keyring 2026-03-24T16:52:51.203 DEBUG:teuthology.orchestra.run.vm01:> sudo chown -R ceph:ceph /var/lib/ceph/mon/ceph-a 2026-03-24T16:52:51.255 DEBUG:teuthology.orchestra.run.vm01:> rm -- /home/ubuntu/cephtest/ceph.monmap 2026-03-24T16:52:51.301 INFO:tasks.ceph:Starting mon daemons in cluster ceph... 2026-03-24T16:52:51.301 INFO:tasks.ceph.mon.a:Restarting daemon 2026-03-24T16:52:51.301 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-mon -f --cluster ceph -i a 2026-03-24T16:52:51.343 INFO:tasks.ceph.mon.a:Started 2026-03-24T16:52:51.343 INFO:tasks.ceph:Starting mgr daemons in cluster ceph... 
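Every daemon above is launched in the foreground under `daemon-helper kill`, so teuthology can signal and reap it, with the whole thing wrapped in `adjust-ulimits` and `ceph-coverage`. A sketch of that command assembly (`start_cmd` is a hypothetical name):

```python
# Sketch: build the daemon start line seen in the log, e.g.
# "sudo adjust-ulimits ceph-coverage <dir> daemon-helper kill ceph-mon -f ...".
# start_cmd is a hypothetical helper name.
def start_cmd(daemon, ident, cluster="ceph",
              coverage_dir="/home/ubuntu/cephtest/archive/coverage"):
    return ["sudo", "adjust-ulimits", "ceph-coverage", coverage_dir,
            "daemon-helper", "kill",
            f"ceph-{daemon}", "-f", "--cluster", cluster, "-i", str(ident)]

print(" ".join(start_cmd("mon", "a")))
print(" ".join(start_cmd("osd", 0)))
```

Running with `-f` (foreground) rather than daemonizing is what lets `daemon-helper` own the process and deliver the kill signal on teardown.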
2026-03-24T16:52:51.343 INFO:tasks.ceph.mgr.x:Restarting daemon
2026-03-24T16:52:51.343 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-mgr -f --cluster ceph -i x
2026-03-24T16:52:51.344 INFO:tasks.ceph.mgr.x:Started
2026-03-24T16:52:51.344 DEBUG:tasks.ceph:set 0 configs
2026-03-24T16:52:51.344 DEBUG:teuthology.orchestra.run.vm01:> sudo ceph --cluster ceph config dump
2026-03-24T16:52:51.455 INFO:teuthology.orchestra.run.vm01.stdout:WHO MASK LEVEL OPTION VALUE RO
2026-03-24T16:52:51.468 INFO:tasks.ceph:Setting crush tunables to default
2026-03-24T16:52:51.468 DEBUG:teuthology.orchestra.run.vm01:> sudo ceph --cluster ceph osd crush tunables default
2026-03-24T16:52:51.577 INFO:teuthology.orchestra.run.vm01.stderr:adjusted tunables profile to default
2026-03-24T16:52:51.596 INFO:tasks.ceph:check_enable_crimson: False
2026-03-24T16:52:51.596 INFO:tasks.ceph:Starting osd daemons in cluster ceph...
2026-03-24T16:52:51.596 DEBUG:teuthology.orchestra.run.vm01:> set -ex
2026-03-24T16:52:51.596 DEBUG:teuthology.orchestra.run.vm01:> sudo dd if=/var/lib/ceph/osd/ceph-0/fsid of=/dev/stdout
2026-03-24T16:52:51.607 DEBUG:teuthology.orchestra.run.vm01:> set -ex
2026-03-24T16:52:51.607 DEBUG:teuthology.orchestra.run.vm01:> sudo dd if=/var/lib/ceph/osd/ceph-1/fsid of=/dev/stdout
2026-03-24T16:52:51.663 DEBUG:teuthology.orchestra.run.vm01:> set -ex
2026-03-24T16:52:51.663 DEBUG:teuthology.orchestra.run.vm01:> sudo dd if=/var/lib/ceph/osd/ceph-2/fsid of=/dev/stdout
2026-03-24T16:52:51.716 DEBUG:teuthology.orchestra.run.vm01:> sudo ceph --cluster ceph osd new 60e193c7-684f-428f-8239-e42abf940efd 0
2026-03-24T16:52:51.881 INFO:teuthology.orchestra.run.vm01.stdout:0
2026-03-24T16:52:51.894 DEBUG:teuthology.orchestra.run.vm01:> sudo ceph --cluster ceph osd new 3fdd8338-b04d-4ff4-a2f5-1de82f2b325c 1
2026-03-24T16:52:52.015 INFO:teuthology.orchestra.run.vm01.stdout:1
2026-03-24T16:52:52.028 DEBUG:teuthology.orchestra.run.vm01:> sudo ceph --cluster ceph osd new 10385bfd-c8a3-402e-825a-5468ed42a5f9 2
2026-03-24T16:52:52.147 INFO:teuthology.orchestra.run.vm01.stdout:2
2026-03-24T16:52:52.160 INFO:tasks.ceph.osd.0:Restarting daemon
2026-03-24T16:52:52.160 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0
2026-03-24T16:52:52.161 INFO:tasks.ceph.osd.0:Started
2026-03-24T16:52:52.161 INFO:tasks.ceph.osd.1:Restarting daemon
2026-03-24T16:52:52.161 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 1
2026-03-24T16:52:52.162 INFO:tasks.ceph.osd.1:Started
2026-03-24T16:52:52.162 INFO:tasks.ceph.osd.2:Restarting daemon
2026-03-24T16:52:52.162 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 2
2026-03-24T16:52:52.163 INFO:tasks.ceph.osd.2:Started
2026-03-24T16:52:52.163 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd dump --format=json
2026-03-24T16:52:52.332 INFO:teuthology.orchestra.run.vm01.stdout:
2026-03-24T16:52:52.332
INFO:teuthology.orchestra.run.vm01.stdout:{"epoch":5,"fsid":"dc403d64-7ddd-4e06-90c8-8f9c41489fa2","created":"2026-03-24T16:52:51.402550+0000","modified":"2026-03-24T16:52:52.145302+0000","last_up_change":"0.000000","last_in_change":"2026-03-24T16:52:52.145302+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":2,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":0,"max_osd":3,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"tentacle","allow_crimson":false,"pools":[],"osds":[{"osd":0,"uuid":"60e193c7-684f-428f-8239-e42abf940efd","up":0,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":0,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[]},"cluster_addrs":{"addrvec":[]},"heartbeat_back_addrs":{"addrvec":[]},"heartbeat_front_addrs":{"addrvec":[]},"public_addr":"(unrecognized address family 0)/0","cluster_addr":"(unrecognized address family 0)/0","heartbeat_back_addr":"(unrecognized address family 0)/0","heartbeat_front_addr":"(unrecognized address family 0)/0","state":["exists","new"]},{"osd":1,"uuid":"3fdd8338-b04d-4ff4-a2f5-1de82f2b325c","up":0,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":0,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[]},"cluster_addrs":{"addrvec":[]},"heartbeat_back_addrs":{"addrvec":[]},"heartbeat_front_addrs":{"addrvec":[]},"public_addr":"(unrecognized address family 0)/0","cluster_addr":"(unrecognized address family 0)/0","heartbeat_back_addr":"(unrecognized address family 0)/0","heartbeat_front_addr":"(unrecognized address family 
0)/0","state":["exists","new"]},{"osd":2,"uuid":"10385bfd-c8a3-402e-825a-5468ed42a5f9","up":0,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":0,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[]},"cluster_addrs":{"addrvec":[]},"heartbeat_back_addrs":{"addrvec":[]},"heartbeat_front_addrs":{"addrvec":[]},"public_addr":"(unrecognized address family 0)/0","cluster_addr":"(unrecognized address family 0)/0","heartbeat_back_addr":"(unrecognized address family 0)/0","heartbeat_front_addr":"(unrecognized address family 0)/0","state":["exists","new"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":0,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":0,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":0,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"isa","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-24T16:52:52.347 INFO:tasks.ceph.ceph_manager.ceph:[] 2026-03-24T16:52:52.347 INFO:tasks.ceph:Waiting for OSDs to come up 2026-03-24T16:52:52.503 INFO:tasks.ceph.osd.1.vm01.stderr:2026-03-24T16:52:52.501+0000 7fb5dfa4fa40 -1 Falling back to public interface 2026-03-24T16:52:52.527 INFO:tasks.ceph.osd.2.vm01.stderr:2026-03-24T16:52:52.525+0000 7f3a8b611a40 -1 Falling back to public interface 
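[editor's note] While "Waiting for OSDs to come up", teuthology repeatedly runs `ceph osd dump --format=json` and counts entries whose `up` flag is set, which produces the "0 of 3 OSDs are up" / "3 of 3 OSDs are up" lines later in the log. A minimal sketch of that check (the helper name is hypothetical; the input is trimmed from the epoch-5 dump above, where all three OSDs exist but none is up yet):

```python
import json

def count_up_osds(osd_dump_json):
    """Count OSDs reporting up=1 in `ceph osd dump --format=json` output."""
    dump = json.loads(osd_dump_json)
    return sum(1 for osd in dump["osds"] if osd["up"] == 1)

# Trimmed from the log's epoch-5 dump: every OSD is in (in=1) but not yet up (up=0),
# matching the "exists","new" state shown for each OSD.
dump = '{"epoch": 5, "osds": [{"osd": 0, "up": 0, "in": 1}, {"osd": 1, "up": 0, "in": 1}, {"osd": 2, "up": 0, "in": 1}]}'
up = count_up_osds(dump)   # 0 of 3 OSDs are up at this point in the log
```

By the epoch-10 dump further down, each OSD's `up` flag has flipped to 1 and its `state` includes "up", so the same count reaches 3 and the wait completes.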
2026-03-24T16:52:52.551 INFO:tasks.ceph.osd.0.vm01.stderr:2026-03-24T16:52:52.549+0000 7f5af31bba40 -1 Falling back to public interface 2026-03-24T16:52:52.649 DEBUG:teuthology.orchestra.run.vm01:> adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph --cluster ceph osd dump --format=json 2026-03-24T16:52:52.988 INFO:teuthology.misc.health.vm01.stderr:2026-03-24T16:52:52.985+0000 7f532c8c9640 0 --2- 192.168.123.101:0/1755679748 >> v2:192.168.123.101:3300/0 conn(0x7f5328153fd0 0x7f53281743b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 2026-03-24T16:52:53.113 INFO:teuthology.misc.health.vm01.stdout: 2026-03-24T16:52:53.114 INFO:teuthology.misc.health.vm01.stdout:{"epoch":5,"fsid":"dc403d64-7ddd-4e06-90c8-8f9c41489fa2","created":"2026-03-24T16:52:51.402550+0000","modified":"2026-03-24T16:52:52.145302+0000","last_up_change":"0.000000","last_in_change":"2026-03-24T16:52:52.145302+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":2,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":0,"max_osd":3,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"tentacle","allow_crimson":false,"pools":[],"osds":[{"osd":0,"uuid":"60e193c7-684f-428f-8239-e42abf940efd","up":0,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":0,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[]},"cluster_addrs":{"addrvec":[]},"heartbeat_back_addrs":{"addrvec":[]},"heartbeat_front_addrs":{"addrvec":[]},"public_addr":"(unrecognized address family 0)/0","cluster_addr":"(unrecognized address family 0)/0","heartbeat_back_addr":"(unrecognized address family 
0)/0","heartbeat_front_addr":"(unrecognized address family 0)/0","state":["exists","new"]},{"osd":1,"uuid":"3fdd8338-b04d-4ff4-a2f5-1de82f2b325c","up":0,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":0,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[]},"cluster_addrs":{"addrvec":[]},"heartbeat_back_addrs":{"addrvec":[]},"heartbeat_front_addrs":{"addrvec":[]},"public_addr":"(unrecognized address family 0)/0","cluster_addr":"(unrecognized address family 0)/0","heartbeat_back_addr":"(unrecognized address family 0)/0","heartbeat_front_addr":"(unrecognized address family 0)/0","state":["exists","new"]},{"osd":2,"uuid":"10385bfd-c8a3-402e-825a-5468ed42a5f9","up":0,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":0,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[]},"cluster_addrs":{"addrvec":[]},"heartbeat_back_addrs":{"addrvec":[]},"heartbeat_front_addrs":{"addrvec":[]},"public_addr":"(unrecognized address family 0)/0","cluster_addr":"(unrecognized address family 0)/0","heartbeat_back_addr":"(unrecognized address family 0)/0","heartbeat_front_addr":"(unrecognized address family 
0)/0","state":["exists","new"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":0,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":0,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":0,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"isa","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-24T16:52:53.165 INFO:tasks.ceph.mgr.x.vm01.stderr:/usr/lib/python3/dist-packages/scipy/__init__.py:67: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode. 2026-03-24T16:52:53.165 INFO:tasks.ceph.mgr.x.vm01.stderr:Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve. 
2026-03-24T16:52:53.165 INFO:tasks.ceph.mgr.x.vm01.stderr: from numpy import show_config as show_numpy_config
2026-03-24T16:52:53.170 DEBUG:teuthology.misc:0 of 3 OSDs are up
2026-03-24T16:52:53.785 INFO:tasks.ceph.osd.1.vm01.stderr:2026-03-24T16:52:53.781+0000 7fb5dfa4fa40 -1 osd.1 0 log_to_monitors true
2026-03-24T16:52:53.924 INFO:tasks.ceph.osd.0.vm01.stderr:2026-03-24T16:52:53.921+0000 7f5af31bba40 -1 osd.0 0 log_to_monitors true
2026-03-24T16:52:53.963 INFO:tasks.ceph.osd.2.vm01.stderr:2026-03-24T16:52:53.961+0000 7f3a8b611a40 -1 osd.2 0 log_to_monitors true
2026-03-24T16:52:54.286 INFO:tasks.ceph.mgr.x.vm01.stderr:Failed to import NVMeoFClient and related components: cannot import name 'NVMeoFClient' from 'dashboard.services.nvmeof_client' (/usr/share/ceph/mgr/dashboard/services/nvmeof_client.py)
2026-03-24T16:52:55.666 INFO:tasks.ceph.osd.1.vm01.stderr:2026-03-24T16:52:55.665+0000 7fb5db9f8640 -1 osd.1 0 waiting for initial osdmap
2026-03-24T16:52:55.666 INFO:tasks.ceph.osd.0.vm01.stderr:2026-03-24T16:52:55.665+0000 7f5aef976640 -1 osd.0 0 waiting for initial osdmap
2026-03-24T16:52:55.669 INFO:tasks.ceph.osd.0.vm01.stderr:2026-03-24T16:52:55.665+0000 7f5ae9f72640 -1 osd.0 7 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-24T16:52:55.670 INFO:tasks.ceph.osd.1.vm01.stderr:2026-03-24T16:52:55.665+0000 7fb5d6806640 -1 osd.1 7 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-24T16:52:55.674 INFO:tasks.ceph.osd.2.vm01.stderr:2026-03-24T16:52:55.673+0000 7f3a875ba640 -1 osd.2 0 waiting for initial osdmap
2026-03-24T16:52:55.677 INFO:tasks.ceph.osd.2.vm01.stderr:2026-03-24T16:52:55.673+0000 7f3a823c8640 -1 osd.2 7 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-24T16:52:55.991 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T16:52:55.989+0000 7f1be77a5640 -1 mgr.server handle_report got status from
non-daemon mon.a 2026-03-24T16:52:59.472 DEBUG:teuthology.orchestra.run.vm01:> adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph --cluster ceph osd dump --format=json 2026-03-24T16:52:59.649 INFO:teuthology.misc.health.vm01.stdout: 2026-03-24T16:52:59.649 INFO:teuthology.misc.health.vm01.stdout:{"epoch":10,"fsid":"dc403d64-7ddd-4e06-90c8-8f9c41489fa2","created":"2026-03-24T16:52:51.402550+0000","modified":"2026-03-24T16:52:58.794409+0000","last_up_change":"2026-03-24T16:52:56.653061+0000","last_in_change":"2026-03-24T16:52:52.145302+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":4,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":1,"max_osd":3,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"tentacle","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-24T16:52:56.800817+0000","flags":32769,"flags_names":"hashpspool,creating","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"10","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-
1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_type":"Fair distribution","score_acting":2.9900000095367432,"score_stable":2.9900000095367432,"optimal_score":0.67000001668930054,"raw_score_acting":2,"raw_score_stable":2,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"60e193c7-684f-428f-8239-e42abf940efd","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6816","nonce":1022245844},{"type":"v1","addr":"192.168.123.101:6817","nonce":1022245844}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6818","nonce":1022245844},{"type":"v1","addr":"192.168.123.101:6819","nonce":1022245844}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6822","nonce":1022245844},{"type":"v1","addr":"192.168.123.101:6823","nonce":1022245844}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6820","nonce":1022245844},{"type":"v1","addr":"192.168.123.101:6821","nonce":1022245844}]},"public_addr":"192.168.123.101:6817/1022245844","cluster_addr":"192.168.123.101:6819/1022245844","heartbeat_back_addr":"192.168.123.101:6823/1022245844","heartbeat_front_addr":"192.168.123.101:6821/1022245844","st
ate":["exists","up"]},{"osd":1,"uuid":"3fdd8338-b04d-4ff4-a2f5-1de82f2b325c","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":9,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6800","nonce":4203820349},{"type":"v1","addr":"192.168.123.101:6801","nonce":4203820349}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6802","nonce":4203820349},{"type":"v1","addr":"192.168.123.101:6803","nonce":4203820349}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6806","nonce":4203820349},{"type":"v1","addr":"192.168.123.101:6807","nonce":4203820349}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6804","nonce":4203820349},{"type":"v1","addr":"192.168.123.101:6805","nonce":4203820349}]},"public_addr":"192.168.123.101:6801/4203820349","cluster_addr":"192.168.123.101:6803/4203820349","heartbeat_back_addr":"192.168.123.101:6807/4203820349","heartbeat_front_addr":"192.168.123.101:6805/4203820349","state":["exists","up"]},{"osd":2,"uuid":"10385bfd-c8a3-402e-825a-5468ed42a5f9","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6808","nonce":831386055},{"type":"v1","addr":"192.168.123.101:6809","nonce":831386055}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6810","nonce":831386055},{"type":"v1","addr":"192.168.123.101:6811","nonce":831386055}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6814","nonce":831386055},{"type":"v1","addr":"192.168.123.101:6815","nonce":831386055}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6812","nonce":831386055},{"type":"v1","addr":"192.168.123.101:6813","nonce":831386055}]},"public_addr":"192.168.123.101:6809/831386055","cluster_addr":"192.168.123.101:6811/831386055","h
eartbeat_back_addr":"192.168.123.101:6815/831386055","heartbeat_front_addr":"192.168.123.101:6813/831386055","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"isa","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-24T16:52:59.662 DEBUG:teuthology.misc:3 of 3 OSDs are up 2026-03-24T16:52:59.662 INFO:tasks.ceph:Creating RBD pool 2026-03-24T16:52:59.662 DEBUG:teuthology.orchestra.run.vm01:> sudo ceph --cluster ceph osd pool create rbd 8 2026-03-24T16:53:00.815 INFO:teuthology.orchestra.run.vm01.stderr:pool 'rbd' created 2026-03-24T16:53:00.838 DEBUG:teuthology.orchestra.run.vm01:> rbd --cluster ceph pool init rbd 2026-03-24T16:53:03.838 INFO:tasks.ceph:Starting mds daemons in cluster ceph... 
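[editor's note] Once all OSDs are up, the RBD pool is created in two steps visible above: `ceph osd pool create rbd 8` (8 placement groups) followed by `rbd pool init rbd`, which tags the pool with the rbd application. As a sketch only (the helper is hypothetical; teuthology issues these through its remote-execution layer, not plain subprocess), the command pair looks like:

```python
# Sketch: the two command lines teuthology runs to create and initialize
# an RBD pool, as seen in the log. Hypothetical helper, for illustration.

def rbd_pool_setup_commands(cluster, pool, pg_num):
    """Return argv lists: `ceph osd pool create` then `rbd pool init`."""
    return [
        ["sudo", "ceph", "--cluster", cluster,
         "osd", "pool", "create", pool, str(pg_num)],
        ["rbd", "--cluster", cluster, "pool", "init", pool],
    ]

cmds = rbd_pool_setup_commands("ceph", "rbd", 8)
```

Running the first command produces the "pool 'rbd' created" stderr line seen in the log; `pool init` then prepares the pool for RBD image creation by the `rbd/cli_generic.sh` workunit.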
2026-03-24T16:53:03.838 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph config log 1 --format=json 2026-03-24T16:53:03.838 INFO:tasks.daemonwatchdog.daemon_watchdog:watchdog starting 2026-03-24T16:53:04.006 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:53:04.021 INFO:teuthology.orchestra.run.vm01.stdout:[{"version":1,"timestamp":"0.000000","name":"","changes":[]}] 2026-03-24T16:53:04.021 INFO:tasks.ceph_manager:config epoch is 1 2026-03-24T16:53:04.021 INFO:tasks.ceph:Waiting until ceph daemons up and pgs clean... 2026-03-24T16:53:04.021 INFO:tasks.ceph.ceph_manager.ceph:waiting for mgr available 2026-03-24T16:53:04.021 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph mgr dump --format=json 2026-03-24T16:53:04.217 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:53:04.235 INFO:teuthology.orchestra.run.vm01.stdout:{"epoch":5,"flags":0,"active_gid":4105,"active_name":"x","active_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6824","nonce":189105572},{"type":"v1","addr":"192.168.123.101:6825","nonce":189105572}]},"active_addr":"192.168.123.101:6825/189105572","active_change":"2026-03-24T16:52:54.779662+0000","active_mgr_features":4544132024016699391,"available":true,"standbys":[],"modules":["iostat","nfs"],"available_modules":[{"name":"alerts","can_run":true,"error_string":"","module_options":{"interval":{"name":"interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"How frequently to reexamine health 
status","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"smtp_destination":{"name":"smtp_destination","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Email address to send alerts to, use commas to separate multiple","long_desc":"","tags":[],"see_also":[]},"smtp_from_name":{"name":"smtp_from_name","type":"str","level":"advanced","flags":1,"default_value":"Ceph","min":"","max":"","enum_allowed":[],"desc":"Email From: name","long_desc":"","tags":[],"see_also":[]},"smtp_host":{"name":"smtp_host","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_password":{"name":"smtp_password","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Password to authenticate with","long_desc":"","tags":[],"see_also":[]},"smtp_port":{"name":"smtp_port","type":"int","level":"advanced","flags":1,"default_value":"465","min":"","max":"","enum_allowed":[],"desc":"SMTP 
port","long_desc":"","tags":[],"see_also":[]},"smtp_sender":{"name":"smtp_sender","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP envelope sender","long_desc":"","tags":[],"see_also":[]},"smtp_ssl":{"name":"smtp_ssl","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Use SSL to connect to SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_user":{"name":"smtp_user","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"User to authenticate as","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"balancer","can_run":true,"error_string":"","module_options":{"active":{"name":"active","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"automatically balance PGs across cluster","long_desc":"","tags":[],"see_also":[]},"begin_time":{"name":"begin_time","type":"str","level":"advanced","flags":1,"default_value":"0000","min":"","max":"","enum_allowed":[],"desc":"beginning time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"begin_weekday":{"name":"begin_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to this day of the week or later","long_desc":"0 = Sunday, 1 = Monday, etc.","tags":[],"see_also":[]},"crush_compat_max_iterations":{"name":"crush_compat_max_iterations","type":"uint","level":"advanced","flags":1,"default_value":"25","min":"1","max":"250","enum_allowed":[],"desc":"maximum number of iterations to attempt 
optimization","long_desc":"","tags":[],"see_also":[]},"crush_compat_metrics":{"name":"crush_compat_metrics","type":"str","level":"advanced","flags":1,"default_value":"pgs,objects,bytes","min":"","max":"","enum_allowed":[],"desc":"metrics with which to calculate OSD utilization","long_desc":"Value is a list of one or more of \"pgs\", \"objects\", or \"bytes\", and indicates which metrics to use to balance utilization.","tags":[],"see_also":[]},"crush_compat_step":{"name":"crush_compat_step","type":"float","level":"advanced","flags":1,"default_value":"0.5","min":"0.001","max":"0.999","enum_allowed":[],"desc":"aggressiveness of optimization","long_desc":".99 is very aggressive, .01 is less aggressive","tags":[],"see_also":[]},"end_time":{"name":"end_time","type":"str","level":"advanced","flags":1,"default_value":"2359","min":"","max":"","enum_allowed":[],"desc":"ending time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"end_weekday":{"name":"end_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to days of the week earlier than this","long_desc":"0 = Sunday, 1 = Monday, 
etc.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_score":{"name":"min_score","type":"float","level":"advanced","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"minimum score, below which no optimization is attempted","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":1,"default_value":"upmap","min":"","max":"","enum_allowed":["crush-compat","none","read","upmap","upmap-read"],"desc":"Balancer mode","long_desc":"","tags":[],"see_also":[]},"pool_ids":{"name":"pool_ids","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"pools which the automatic balancing will be limited to","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and attempt 
optimization","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"update_pg_upmap_activity":{"name":"update_pg_upmap_activity","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Updates pg_upmap activity stats to be used in `balancer status detail`","long_desc":"","tags":[],"see_also":[]},"upmap_max_deviation":{"name":"upmap_max_deviation","type":"int","level":"advanced","flags":1,"default_value":"5","min":"1","max":"","enum_allowed":[],"desc":"deviation below which no optimization is attempted","long_desc":"If the number of PGs are within this count then no optimization is attempted","tags":[],"see_also":[]},"upmap_max_optimizations":{"name":"upmap_max_optimizations","type":"uint","level":"advanced","flags":1,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"maximum upmap optimizations to make per attempt","long_desc":"","tags":[],"see_also":[]}}},{"name":"cephadm","can_run":true,"error_string":"","module_options":{"agent_down_multiplier":{"name":"agent_down_multiplier","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"","max":"","enum_allowed":[],"desc":"Multiplied by agent refresh rate to calculate how long agent must not report before being marked down","long_desc":"","tags":[],"see_also":[]},"agent_refresh_rate":{"name":"agent_refresh_rate","type":"secs","level":"advanced","flags":0,"default_value":"20","min":"","max":"","enum_allowed":[],"desc":"How often agent on each host will try to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"agent_starting_port":{"name":"agent_starting_port","type":"int","level":"advanced","flags":0,"default_value":"4721","min":"","max":"","enum_allowed":[],"desc":"First port agent will try to bind to (will also try up to next 1000 subsequent 
ports if blocked)","long_desc":"","tags":[],"see_also":[]},"allow_ptrace":{"name":"allow_ptrace","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow SYS_PTRACE capability on ceph containers","long_desc":"The SYS_PTRACE capability is needed to attach to a process with gdb or strace. Enabling this options can allow debugging daemons that encounter problems at runtime.","tags":[],"see_also":[]},"autotune_interval":{"name":"autotune_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to autotune daemon memory","long_desc":"","tags":[],"see_also":[]},"autotune_memory_target_ratio":{"name":"autotune_memory_target_ratio","type":"float","level":"advanced","flags":0,"default_value":"0.7","min":"","max":"","enum_allowed":[],"desc":"ratio of total system memory to divide amongst autotuned daemons","long_desc":"","tags":[],"see_also":[]},"cephadm_log_destination":{"name":"cephadm_log_destination","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":["file","file,syslog","syslog"],"desc":"Destination for cephadm command's persistent logging","long_desc":"","tags":[],"see_also":[]},"certificate_automated_rotation_enabled":{"name":"certificate_automated_rotation_enabled","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"This flag controls whether cephadm automatically rotates certificates upon expiration.","long_desc":"","tags":[],"see_also":[]},"certificate_check_debug_mode":{"name":"certificate_check_debug_mode","type":"bool","level":"dev","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"FOR TESTING ONLY: This flag forces the certificate check instead of waiting for 
certificate_check_period.","long_desc":"","tags":[],"see_also":[]},"certificate_check_period":{"name":"certificate_check_period","type":"int","level":"advanced","flags":0,"default_value":"1","min":"0","max":"30","enum_allowed":[],"desc":"Specifies how often (in days) the certificate should be checked for validity.","long_desc":"","tags":[],"see_also":[]},"certificate_duration_days":{"name":"certificate_duration_days","type":"int","level":"advanced","flags":0,"default_value":"1095","min":"90","max":"3650","enum_allowed":[],"desc":"Specifies the duration of self certificates generated and signed by cephadm root CA","long_desc":"","tags":[],"see_also":[]},"certificate_renewal_threshold_days":{"name":"certificate_renewal_threshold_days","type":"int","level":"advanced","flags":0,"default_value":"30","min":"10","max":"90","enum_allowed":[],"desc":"Specifies the lead time in days to initiate certificate renewal before expiration.","long_desc":"","tags":[],"see_also":[]},"cgroups_split":{"name":"cgroups_split","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Pass --cgroups=split when cephadm creates containers (currently podman only)","long_desc":"","tags":[],"see_also":[]},"config_checks_enabled":{"name":"config_checks_enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable or disable the cephadm configuration analysis","long_desc":"","tags":[],"see_also":[]},"config_dashboard":{"name":"config_dashboard","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"manage configs like API endpoints in Dashboard.","long_desc":"","tags":[],"see_also":[]},"container_image_alertmanager":{"name":"container_image_alertmanager","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/alertmanager:v0.28.1","min":"","max":"","enum_allowed":[],"desc":"Alertmanager container 
image","long_desc":"","tags":[],"see_also":[]},"container_image_base":{"name":"container_image_base","type":"str","level":"advanced","flags":1,"default_value":"quay.io/ceph/ceph","min":"","max":"","enum_allowed":[],"desc":"Container image name, without the tag","long_desc":"","tags":[],"see_also":[]},"container_image_elasticsearch":{"name":"container_image_elasticsearch","type":"str","level":"advanced","flags":0,"default_value":"quay.io/omrizeneva/elasticsearch:6.8.23","min":"","max":"","enum_allowed":[],"desc":"Elasticsearch container image","long_desc":"","tags":[],"see_also":[]},"container_image_grafana":{"name":"container_image_grafana","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/grafana:12.3.1","min":"","max":"","enum_allowed":[],"desc":"Grafana container image","long_desc":"","tags":[],"see_also":[]},"container_image_haproxy":{"name":"container_image_haproxy","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/haproxy:2.3","min":"","max":"","enum_allowed":[],"desc":"Haproxy container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_agent":{"name":"container_image_jaeger_agent","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-agent:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger agent container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_collector":{"name":"container_image_jaeger_collector","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-collector:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger collector container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_query":{"name":"container_image_jaeger_query","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-query:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger query container 
image","long_desc":"","tags":[],"see_also":[]},"container_image_keepalived":{"name":"container_image_keepalived","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/keepalived:2.2.4","min":"","max":"","enum_allowed":[],"desc":"Keepalived container image","long_desc":"","tags":[],"see_also":[]},"container_image_loki":{"name":"container_image_loki","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/loki:3.0.0","min":"","max":"","enum_allowed":[],"desc":"Loki container image","long_desc":"","tags":[],"see_also":[]},"container_image_nginx":{"name":"container_image_nginx","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/nginx:sclorg-nginx-126","min":"","max":"","enum_allowed":[],"desc":"Nginx container image","long_desc":"","tags":[],"see_also":[]},"container_image_node_exporter":{"name":"container_image_node_exporter","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/node-exporter:v1.9.1","min":"","max":"","enum_allowed":[],"desc":"Node exporter container image","long_desc":"","tags":[],"see_also":[]},"container_image_nvmeof":{"name":"container_image_nvmeof","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/nvmeof:1.5","min":"","max":"","enum_allowed":[],"desc":"Nvmeof container image","long_desc":"","tags":[],"see_also":[]},"container_image_oauth2_proxy":{"name":"container_image_oauth2_proxy","type":"str","level":"advanced","flags":0,"default_value":"quay.io/oauth2-proxy/oauth2-proxy:v7.6.0","min":"","max":"","enum_allowed":[],"desc":"Oauth2 proxy container image","long_desc":"","tags":[],"see_also":[]},"container_image_prometheus":{"name":"container_image_prometheus","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/prometheus:v3.6.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container 
image","long_desc":"","tags":[],"see_also":[]},"container_image_promtail":{"name":"container_image_promtail","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/promtail:3.0.0","min":"","max":"","enum_allowed":[],"desc":"Promtail container image","long_desc":"","tags":[],"see_also":[]},"container_image_samba":{"name":"container_image_samba","type":"str","level":"advanced","flags":0,"default_value":"quay.io/samba.org/samba-server:ceph20-centos-amd64","min":"","max":"","enum_allowed":[],"desc":"Samba container image","long_desc":"","tags":[],"see_also":[]},"container_image_samba_metrics":{"name":"container_image_samba_metrics","type":"str","level":"advanced","flags":0,"default_value":"quay.io/samba.org/samba-metrics:ceph20-centos-amd64","min":"","max":"","enum_allowed":[],"desc":"Samba metrics container image","long_desc":"","tags":[],"see_also":[]},"container_image_snmp_gateway":{"name":"container_image_snmp_gateway","type":"str","level":"advanced","flags":0,"default_value":"docker.io/maxwo/snmp-notifier:v1.2.1","min":"","max":"","enum_allowed":[],"desc":"Snmp gateway container image","long_desc":"","tags":[],"see_also":[]},"container_init":{"name":"container_init","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Run podman/docker with `--init`","long_desc":"","tags":[],"see_also":[]},"daemon_cache_timeout":{"name":"daemon_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"seconds to cache service (daemon) inventory","long_desc":"","tags":[],"see_also":[]},"default_cephadm_command_timeout":{"name":"default_cephadm_command_timeout","type":"int","level":"advanced","flags":0,"default_value":"900","min":"","max":"","enum_allowed":[],"desc":"Default timeout applied to cephadm commands run directly on the host (in 
seconds)","long_desc":"","tags":[],"see_also":[]},"default_registry":{"name":"default_registry","type":"str","level":"advanced","flags":0,"default_value":"quay.io","min":"","max":"","enum_allowed":[],"desc":"Search-registry to which we should normalize unqualified image names. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"device_cache_timeout":{"name":"device_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"1800","min":"","max":"","enum_allowed":[],"desc":"seconds to cache device inventory","long_desc":"","tags":[],"see_also":[]},"device_enhanced_scan":{"name":"device_enhanced_scan","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use libstoragemgmt during device scans","long_desc":"","tags":[],"see_also":[]},"facts_cache_timeout":{"name":"facts_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"seconds to cache host facts data","long_desc":"","tags":[],"see_also":[]},"grafana_dashboards_path":{"name":"grafana_dashboards_path","type":"str","level":"advanced","flags":0,"default_value":"/etc/grafana/dashboards/ceph-dashboard/","min":"","max":"","enum_allowed":[],"desc":"location of dashboards to include in grafana deployments","long_desc":"","tags":[],"see_also":[]},"host_check_interval":{"name":"host_check_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to perform a host check","long_desc":"","tags":[],"see_also":[]},"hw_monitoring":{"name":"hw_monitoring","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Deploy hw monitoring daemon on every 
host.","long_desc":"","tags":[],"see_also":[]},"inventory_list_all":{"name":"inventory_list_all","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Whether ceph-volume inventory should report more devices (mostly mappers (LVs / mpaths), partitions...)","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_refresh_metadata":{"name":"log_refresh_metadata","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Log all refresh metadata. Includes daemon, device, and host info collected regularly. Only has effect if logging at debug level","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"log to the \"cephadm\" cluster log channel\"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf":{"name":"manage_etc_ceph_ceph_conf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Manage and own /etc/ceph/ceph.conf on the 
hosts.","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf_hosts":{"name":"manage_etc_ceph_ceph_conf_hosts","type":"str","level":"advanced","flags":0,"default_value":"*","min":"","max":"","enum_allowed":[],"desc":"PlacementSpec describing on which hosts to manage /etc/ceph/ceph.conf","long_desc":"","tags":[],"see_also":[]},"max_count_per_host":{"name":"max_count_per_host","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of daemons per service per host","long_desc":"","tags":[],"see_also":[]},"max_osd_draining_count":{"name":"max_osd_draining_count","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of osds that will be drained simultaneously when osds are removed","long_desc":"","tags":[],"see_also":[]},"migration_current":{"name":"migration_current","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"internal - do not modify","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":0,"default_value":"root","min":"","max":"","enum_allowed":["cephadm-package","root"],"desc":"mode for remote execution of cephadm","long_desc":"","tags":[],"see_also":[]},"oob_default_addr":{"name":"oob_default_addr","type":"str","level":"advanced","flags":0,"default_value":"169.254.1.1","min":"","max":"","enum_allowed":[],"desc":"Default address for RedFish API (oob management).","long_desc":"","tags":[],"see_also":[]},"prometheus_alerts_path":{"name":"prometheus_alerts_path","type":"str","level":"advanced","flags":0,"default_value":"/etc/prometheus/ceph/ceph_default_alerts.yml","min":"","max":"","enum_allowed":[],"desc":"location of alerts to include in prometheus 
deployments","long_desc":"","tags":[],"see_also":[]},"registry_insecure":{"name":"registry_insecure","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Registry is to be considered insecure (no TLS available). Only for development purposes.","long_desc":"","tags":[],"see_also":[]},"registry_password":{"name":"registry_password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository password. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"registry_url":{"name":"registry_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Registry url for login purposes. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"registry_username":{"name":"registry_username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository username. 
Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"secure_monitoring_stack":{"name":"secure_monitoring_stack","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable TLS security for all the monitoring stack daemons","long_desc":"","tags":[],"see_also":[]},"service_discovery_port":{"name":"service_discovery_port","type":"int","level":"advanced","flags":0,"default_value":"8765","min":"","max":"","enum_allowed":[],"desc":"cephadm service discovery port","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssh_config_file":{"name":"ssh_config_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"customized SSH config file to connect to managed hosts","long_desc":"","tags":[],"see_also":[]},"ssh_keepalive_count_max":{"name":"ssh_keepalive_count_max","type":"int","level":"advanced","flags":0,"default_value":"3","min":"","max":"","enum_allowed":[],"desc":"How many times ssh connections can fail liveness checks before the host is marked offline","long_desc":"","tags":[],"see_also":[]},"ssh_keepalive_interval":{"name":"ssh_keepalive_interval","type":"int","level":"advanced","flags":0,"default_value":"7","min":"","max":"","enum_allowed":[],"desc":"How often ssh connections are checked for liveness","long_desc":"","tags":[],"see_also":[]},"stray_daemon_check_interval":{"name":"stray_daemon_check_interval","type":"secs","level":"advanced","flags":0,"default_value":"1800","min":"","max":"","enum_allowed":[],"desc":"how frequently cephadm should check for the presence of stray 
daemons","long_desc":"","tags":[],"see_also":[]},"use_agent":{"name":"use_agent","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use cephadm agent on each host to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"use_repo_digest":{"name":"use_repo_digest","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Automatically convert image tags to image digest. Make sure all daemons use the same image","long_desc":"","tags":[],"see_also":[]},"warn_on_failed_host_check":{"name":"warn_on_failed_host_check","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if the host check fails","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_daemons":{"name":"warn_on_stray_daemons","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected that are not managed by cephadm","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_hosts":{"name":"warn_on_stray_hosts","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected on a host that is not managed by 
cephadm","long_desc":"","tags":[],"see_also":[]}}},{"name":"crash","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"retain_interval":{"name":"retain_interval","type":"secs","level":"advanced","flags":1,"default_value":"31536000","min":"","max":"","enum_allowed":[],"desc":"how long to retain crashes before pruning them","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"warn_recent_interval":{"name":"warn_recent_interval","type":"secs","level":"advanced","flags":1,"default_value":"1209600","min":"","max":"","enum_allowed":[],"desc":"time interval in which to warn about recent 
crashes","long_desc":"","tags":[],"see_also":[]}}},{"name":"dashboard","can_run":true,"error_string":"","module_options":{"ACCOUNT_LOCKOUT_ATTEMPTS":{"name":"ACCOUNT_LOCKOUT_ATTEMPTS","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_HOST":{"name":"ALERTMANAGER_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_SSL_VERIFY":{"name":"ALERTMANAGER_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_ENABLED":{"name":"AUDIT_API_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_LOG_PAYLOAD":{"name":"AUDIT_API_LOG_PAYLOAD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ENABLE_BROWSABLE_API":{"name":"ENABLE_BROWSABLE_API","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_CEPHFS":{"name":"FEATURE_TOGGLE_CEPHFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_DASHBOARD":{"name":"FEATURE_TOGGLE_DASHBOARD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_ISCSI":{"name":"FEATURE_TOGGLE_ISCSI","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"
FEATURE_TOGGLE_MIRRORING":{"name":"FEATURE_TOGGLE_MIRRORING","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_NFS":{"name":"FEATURE_TOGGLE_NFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RBD":{"name":"FEATURE_TOGGLE_RBD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RGW":{"name":"FEATURE_TOGGLE_RGW","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE":{"name":"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_PASSWORD":{"name":"GRAFANA_API_PASSWORD","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_SSL_VERIFY":{"name":"GRAFANA_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_URL":{"name":"GRAFANA_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_USERNAME":{"name":"GRAFANA_API_USERNAME","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_FRONTEND_API_URL":{"name":"GRAFANA_FRONTEND_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","
max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_UPDATE_DASHBOARDS":{"name":"GRAFANA_UPDATE_DASHBOARDS","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISCSI_API_SSL_VERIFICATION":{"name":"ISCSI_API_SSL_VERIFICATION","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISSUE_TRACKER_API_KEY":{"name":"ISSUE_TRACKER_API_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"MANAGED_BY_CLUSTERS":{"name":"MANAGED_BY_CLUSTERS","type":"str","level":"advanced","flags":0,"default_value":"[]","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"MULTICLUSTER_CONFIG":{"name":"MULTICLUSTER_CONFIG","type":"str","level":"advanced","flags":0,"default_value":"{}","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_HOST":{"name":"PROMETHEUS_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_SSL_VERIFY":{"name":"PROMETHEUS_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROM_ALERT_CREDENTIAL_CACHE_TTL":{"name":"PROM_ALERT_CREDENTIAL_CACHE_TTL","type":"int","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_COMPLEXITY_ENABLED":{"name":"PWD_POLICY_CHECK_COMPLEXITY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_PO
LICY_CHECK_EXCLUSION_LIST_ENABLED":{"name":"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_LENGTH_ENABLED":{"name":"PWD_POLICY_CHECK_LENGTH_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_OLDPWD_ENABLED":{"name":"PWD_POLICY_CHECK_OLDPWD_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_USERNAME_ENABLED":{"name":"PWD_POLICY_CHECK_USERNAME_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_ENABLED":{"name":"PWD_POLICY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_EXCLUSION_LIST":{"name":"PWD_POLICY_EXCLUSION_LIST","type":"str","level":"advanced","flags":0,"default_value":"osd,host,dashboard,pool,block,nfs,ceph,monitors,gateway,logs,crush,maps","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_COMPLEXITY":{"name":"PWD_POLICY_MIN_COMPLEXITY","type":"int","level":"advan
ced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_LENGTH":{"name":"PWD_POLICY_MIN_LENGTH","type":"int","level":"advanced","flags":0,"default_value":"8","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"REST_REQUESTS_TIMEOUT":{"name":"REST_REQUESTS_TIMEOUT","type":"int","level":"advanced","flags":0,"default_value":"45","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ACCESS_KEY":{"name":"RGW_API_ACCESS_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ADMIN_RESOURCE":{"name":"RGW_API_ADMIN_RESOURCE","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SECRET_KEY":{"name":"RGW_API_SECRET_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SSL_VERIFY":{"name":"RGW_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_HOSTNAME_PER_DAEMON":{"name":"RGW_HOSTNAME_PER_DAEMON","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"UNSAFE_TLS_v1_2":{"name":"UNSAFE_TLS_v1_2","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_SPAN":{"name":"USER_PWD_EXPIRATION_SPAN","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_1":{"name":"USER_PWD
_EXPIRATION_WARNING_1","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_2":{"name":"USER_PWD_EXPIRATION_WARNING_2","type":"int","level":"advanced","flags":0,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"cross_origin_url":{"name":"cross_origin_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"crt_file":{"name":"crt_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"crypto_caller":{"name":"crypto_caller","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"debug":{"name":"debug","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable/disable debug 
options","long_desc":"","tags":[],"see_also":[]},"jwt_token_ttl":{"name":"jwt_token_ttl","type":"int","level":"advanced","flags":0,"default_value":"28800","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"motd":{"name":"motd","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"The message of the 
day","long_desc":"","tags":[],"see_also":[]},"redirect_resolve_ip_addr":{"name":"redirect_resolve_ip_addr","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":0,"default_value":"8080","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl_server_port":{"name":"ssl_server_port","type":"int","level":"advanced","flags":0,"default_value":"8443","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sso_oauth2":{"name":"sso_oauth2","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":0,"default_value":"redirect","min":"","max":"","enum_allowed":["error","redirect"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":0,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url_prefix":{"name":"url_prefix","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}}
,{"name":"devicehealth","can_run":true,"error_string":"","module_options":{"enable_monitoring":{"name":"enable_monitoring","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"monitor device health metrics","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mark_out_threshold":{"name":"mark_out_threshold","type":"secs","level":"advanced","flags":1,"default_value":"2419200","min":"","max":"","enum_allowed":[],"desc":"automatically mark OSD if it may fail before this long","long_desc":"","tags":[],"see_also":[]},"pool_name":{"name":"pool_name","type":"str","level":"advanced","flags":1,"default_value":"device_health_metrics","min":"","max":"","enum_allowed":[],"desc":"name of pool in which to store device health metrics","long_desc":"","tags":[],"see_also":[]},"retention_period":{"name":"retention_period","type":"secs","level":"advanced","flags":1,"default_value":"15552000","min":"","max":"","enum_allowed":[],"desc":"how long to retain device health 
metrics","long_desc":"","tags":[],"see_also":[]},"scrape_frequency":{"name":"scrape_frequency","type":"secs","level":"advanced","flags":1,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"how frequently to scrape device health metrics","long_desc":"","tags":[],"see_also":[]},"self_heal":{"name":"self_heal","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"preemptively heal cluster around devices that may fail","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and check device health","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"warn_threshold":{"name":"warn_threshold","type":"secs","level":"advanced","flags":1,"default_value":"7257600","min":"","max":"","enum_allowed":[],"desc":"raise health warning if OSD may fail before this 
long","long_desc":"","tags":[],"see_also":[]}}},{"name":"diskprediction_local","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predict_interval":{"name":"predict_interval","type":"str","level":"advanced","flags":0,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predictor_model":{"name":"predictor_model","type":"str","level":"advanced","flags":0,"default_value":"prophetstor","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"str","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"influx","can_run":false,"error_string":"influxdb python module not 
found","module_options":{"batch_size":{"name":"batch_size","type":"int","level":"advanced","flags":0,"default_value":"5000","min":"","max":"","enum_allowed":[],"desc":"How big batches of data points should be when sending to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"database":{"name":"database","type":"str","level":"advanced","flags":0,"default_value":"ceph","min":"","max":"","enum_allowed":[],"desc":"InfluxDB database name. You will need to create this database and grant write privileges to the configured username or the username must have admin privileges to create it.","long_desc":"","tags":[],"see_also":[]},"hostname":{"name":"hostname","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server hostname","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"30","min":"5","max":"","enum_allowed":[],"desc":"Time between reports to InfluxDB. 
Default 30 seconds.","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"password":{"name":"password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"password of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"port":{"name":"port","type":"int","level":"advanced","flags":0,"default_value":"8086","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server port","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"str","level":"advanced","flags":0,"default_value":"false","min":"","max":"","enum_allowed":[],"desc":"Use https connection for InfluxDB server. 
Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]},"threads":{"name":"threads","type":"int","level":"advanced","flags":0,"default_value":"5","min":"1","max":"32","enum_allowed":[],"desc":"How many worker threads should be spawned for sending data to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"username":{"name":"username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"username of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"verify_ssl":{"name":"verify_ssl","type":"str","level":"advanced","flags":0,"default_value":"true","min":"","max":"","enum_allowed":[],"desc":"Verify https cert for InfluxDB server. Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]}}},{"name":"insights","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"iostat","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level
","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"localpool","can_run":true,"error_string":"","module_options":{"failure_domain":{"name":"failure_domain","type":"str","level":"advanced","flags":1,"default_value":"host","min":"","max":"","enum_allowed":[],"desc":"failure domain for any created local pool","long_desc":"what failure domain we should separate data replicas 
across.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_size":{"name":"min_size","type":"int","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"default min_size for any created local pool","long_desc":"value to set min_size to (unchanged from Ceph's default if this option is not set)","tags":[],"see_also":[]},"num_rep":{"name":"num_rep","type":"int","level":"advanced","flags":1,"default_value":"3","min":"","max":"","enum_allowed":[],"desc":"default replica count for any created local pool","long_desc":"","tags":[],"see_also":[]},"pg_num":{"name":"pg_num","type":"int","level":"advanced","flags":1,"default_value":"128","min":"","max":"","enum_allowed":[],"desc":"default pg_num for any created local pool","long_desc":"","tags":[],"see_also":[]},"prefix":{"name":"prefix","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"name prefix for any created local 
pool","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"subtree":{"name":"subtree","type":"str","level":"advanced","flags":1,"default_value":"rack","min":"","max":"","enum_allowed":[],"desc":"CRUSH level for which to create a local pool","long_desc":"which CRUSH subtree type the module should create a pool for.","tags":[],"see_also":[]}}},{"name":"mirroring","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"nfs","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level"
:"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"orchestrator","can_run":true,"error_string":"","module_options":{"fail_fs":{"name":"fail_fs","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Fail filesystem for rapid multi-rank mds 
upgrade","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"orchestrator":{"name":"orchestrator","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["cephadm","rook","test_orchestrator"],"desc":"Orchestrator 
backend","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_perf_query","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":""
,"enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"pg_autoscaler","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"threshold":{"name":"threshold","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"1.0","max":"","enum_allowed":[],"desc":"scaling threshold","long_desc":"The 
factor by which the `NEW PG_NUM` must vary from the current`PG_NUM` before being accepted. Cannot be less than 1.0","tags":[],"see_also":[]}}},{"name":"progress","can_run":true,"error_string":"","module_options":{"allow_pg_recovery_event":{"name":"allow_pg_recovery_event","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow the module to show pg recovery progress","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_completed_events":{"name":"max_completed_events","type":"int","level":"advanced","flags":1,"default_value":"50","min":"","max":"","enum_allowed":[],"desc":"number of past completed events to remember","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"how long the module is going to 
sleep","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"prometheus","can_run":true,"error_string":"","module_options":{"cache":{"name":"cache","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"exclude_perf_counters":{"name":"exclude_perf_counters","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Do not include perf-counters in the metrics output","long_desc":"Gathering perf-counters from a single Prometheus exporter can degrade ceph-mgr performance, especially in large clusters. Instead, Ceph-exporter daemons are now used by default for perf-counter gathering. This should only be disabled when no ceph-exporters are deployed.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools":{"name":"rbd_stats_pools","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","
enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools_refresh_interval":{"name":"rbd_stats_pools_refresh_interval","type":"int","level":"advanced","flags":0,"default_value":"300","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"scrape_interval":{"name":"scrape_interval","type":"float","level":"advanced","flags":0,"default_value":"15.0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"the IPv4 or IPv6 address on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":1,"default_value":"9283","min":"","max":"","enum_allowed":[],"desc":"the port on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"stale_cache_strategy":{"name":"stale_cache_strategy","type":"str","level":"advanced","flags":0,"default_value":"log","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":1,"default_value":"default","min":"","max":"","enum_allowed":["default","error"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":1,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rbd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","
max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_snap_create":{"name":"max_concurrent_snap_create","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mirror_snapshot_schedule":{"name":"mirror_snapshot_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"trash_purge_schedule":{"name":"trash_purge_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rgw","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_al
lowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"secondary_zone_period_retry_limit":{"name":"secondary_zone_period_retry_limit","type":"int","level":"advanced","flags":0,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"RGW module period update retry limit for secondary site","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"selftest","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption1":{"name":"roption1","type":"str","level":"advanced","flags":0,"default_value":"","mi
n":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption2":{"name":"roption2","type":"str","level":"advanced","flags":0,"default_value":"xyz","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption1":{"name":"rwoption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption2":{"name":"rwoption2","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption3":{"name":"rwoption3","type":"float","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption4":{"name":"rwoption4","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption5":{"name":"rwoption5","type":"bool","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption6":{"name":"rwoption6","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption7":{"name":"rwoption7","type":"int","level":"advanced","flags":0,"default_value":"","min":"1","max":"42","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testkey":{"name":"testkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testlkey":{"name":"testlkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"
","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testnewline":{"name":"testnewline","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"snap_schedule","can_run":true,"error_string":"","module_options":{"allow_m_granularity":{"name":"allow_m_granularity","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow minute scheduled snapshots","long_desc":"","tags":[],"see_also":[]},"dump_on_update":{"name":"dump_on_update","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"dump database to debug log on update","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"stats","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default
_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"status","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":
"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telegraf","can_run":true,"error_string":"","module_options":{"address":{"name":"address","type":"str","level":"advanced","flags":0,"default_value":"unixgram:///tmp/telegraf.sock","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"15","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telemetry","can_run":true,"error_string":"","module_options":{"channel_basic":{"name":"channel_basic","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share basic cluster information (size, 
version)","long_desc":"","tags":[],"see_also":[]},"channel_crash":{"name":"channel_crash","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share metadata about Ceph daemon crashes (version, stack traces, etc)","long_desc":"","tags":[],"see_also":[]},"channel_device":{"name":"channel_device","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share device health metrics (e.g., SMART data, minus potentially identifying info like serial numbers)","long_desc":"","tags":[],"see_also":[]},"channel_ident":{"name":"channel_ident","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share a user-provided description and/or contact email for the cluster","long_desc":"","tags":[],"see_also":[]},"channel_perf":{"name":"channel_perf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share various performance metrics of a 
cluster","long_desc":"","tags":[],"see_also":[]},"contact":{"name":"contact","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"description":{"name":"description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"device_url":{"name":"device_url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/device","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"int","level":"advanced","flags":0,"default_value":"24","min":"8","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"last_opt_revision":{"name":"last_opt_revision","type":"int","level":"advanced","flags":0,"default_value":"1","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard":{"name":"leaderboard","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard_description":{"name":"leaderboard_description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_t
o_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"organization":{"name":"organization","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"proxy":{"name":"proxy","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url":{"name":"url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/report","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"test_orchestrator","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced
","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"volumes","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_clones":{"name":"max_concurrent_clones","type":"int","level":"advanced","flags":0,"default_value":"4","min":"","max":"","enum_allowed":[],"desc":"Number of asynchronous cloner threads","long_desc":"","tags":[],"see_also":[]},"pause_cloning":{"name":"pause_cloning","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Pause asynchronous cloner threads","long_desc":"","tags":[],"see_also":[]},"pause_purging":{"name":"pause_purging","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Pause asynchronous subvolume purge 
threads","long_desc":"","tags":[],"see_also":[]},"periodic_async_work":{"name":"periodic_async_work","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Periodically check for async work","long_desc":"","tags":[],"see_also":[]},"snapshot_clone_delay":{"name":"snapshot_clone_delay","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"Delay clone begin operation by snapshot_clone_delay seconds","long_desc":"","tags":[],"see_also":[]},"snapshot_clone_no_wait":{"name":"snapshot_clone_no_wait","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Reject subvolume clone request when cloner threads are busy","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}}],"services":{},"always_on_modules":{"octopus":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"pacific":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"quincy":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"reef":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"squid":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"tentacle":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"]},"force_disabled_modules":{},"last_failure_osd_epoch":0,"active_clients":[{"name":"devicehealth","addrvec":[{"type":"v2","addr":"192.168.123.101:0","nonce":17703
20242}]},{"name":"libcephsqlite","addrvec":[{"type":"v2","addr":"192.168.123.101:0","nonce":2654874061}]},{"name":"rbd_support","addrvec":[{"type":"v2","addr":"192.168.123.101:0","nonce":1914320998}]},{"name":"volumes","addrvec":[{"type":"v2","addr":"192.168.123.101:0","nonce":213316915}]}]} 2026-03-24T16:53:04.235 INFO:tasks.ceph.ceph_manager.ceph:mgr available! 2026-03-24T16:53:04.235 INFO:tasks.ceph.ceph_manager.ceph:waiting for all up 2026-03-24T16:53:04.235 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd dump --format=json 2026-03-24T16:53:04.412 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:53:04.412 INFO:teuthology.orchestra.run.vm01.stdout:{"epoch":15,"fsid":"dc403d64-7ddd-4e06-90c8-8f9c41489fa2","created":"2026-03-24T16:52:51.402550+0000","modified":"2026-03-24T16:53:03.824710+0000","last_up_change":"2026-03-24T16:52:56.653061+0000","last_in_change":"2026-03-24T16:52:52.145302+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":4,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":2,"max_osd":3,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"tentacle","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-24T16:52:56.800817+0000","flags":1,"flags_names":"hashpspool","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid
":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"11","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_type":"Fair 
distribution","score_acting":2.9900000095367432,"score_stable":2.9900000095367432,"optimal_score":0.67000001668930054,"raw_score_acting":2,"raw_score_stable":2,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}},{"pool":2,"pool_name":"rbd","create_time":"2026-03-24T16:52:59.857305+0000","flags":8193,"flags_names":"hashpspool,selfmanaged_snaps","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":8,"pg_placement_num":8,"pg_placement_num_target":8,"pg_num_target":8,"pg_num_pending":8,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"15","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":2,"snap_epoch":15,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{},"application_metadata":{"rbd":{}},"read_balance":{"score_type":"Fair 
distribution","score_acting":1.8799999952316284,"score_stable":1.8799999952316284,"optimal_score":1,"raw_score_acting":1.8799999952316284,"raw_score_stable":1.8799999952316284,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"60e193c7-684f-428f-8239-e42abf940efd","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":12,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6816","nonce":1022245844},{"type":"v1","addr":"192.168.123.101:6817","nonce":1022245844}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6818","nonce":1022245844},{"type":"v1","addr":"192.168.123.101:6819","nonce":1022245844}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6822","nonce":1022245844},{"type":"v1","addr":"192.168.123.101:6823","nonce":1022245844}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6820","nonce":1022245844},{"type":"v1","addr":"192.168.123.101:6821","nonce":1022245844}]},"public_addr":"192.168.123.101:6817/1022245844","cluster_addr":"192.168.123.101:6819/1022245844","heartbeat_back_addr":"192.168.123.101:6823/1022245844","heartbeat_front_addr":"192.168.123.101:6821/1022245844","state":["exists","up"]},{"osd":1,"uuid":"3fdd8338-b04d-4ff4-a2f5-1de82f2b325c","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":12,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6800","nonce":4203820349},{"type":"v1","addr":"192.168.123.101:6801","nonce":4203820349}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6802","nonce":4203820349},{"type":"v1","addr":"192.168.123.101:6803","nonce":4203820349}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6806","nonce":4203820349},{"type":"v1","addr":"192.168.123.101:6807","nonce":420
3820349}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6804","nonce":4203820349},{"type":"v1","addr":"192.168.123.101:6805","nonce":4203820349}]},"public_addr":"192.168.123.101:6801/4203820349","cluster_addr":"192.168.123.101:6803/4203820349","heartbeat_back_addr":"192.168.123.101:6807/4203820349","heartbeat_front_addr":"192.168.123.101:6805/4203820349","state":["exists","up"]},{"osd":2,"uuid":"10385bfd-c8a3-402e-825a-5468ed42a5f9","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":12,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6808","nonce":831386055},{"type":"v1","addr":"192.168.123.101:6809","nonce":831386055}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6810","nonce":831386055},{"type":"v1","addr":"192.168.123.101:6811","nonce":831386055}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6814","nonce":831386055},{"type":"v1","addr":"192.168.123.101:6815","nonce":831386055}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6812","nonce":831386055},{"type":"v1","addr":"192.168.123.101:6813","nonce":831386055}]},"public_addr":"192.168.123.101:6809/831386055","cluster_addr":"192.168.123.101:6811/831386055","heartbeat_back_addr":"192.168.123.101:6815/831386055","heartbeat_front_addr":"192.168.123.101:6813/831386055","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0}],"pg_u
pmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"isa","technique":"reed_sol_van"}},"removed_snaps_queue":[{"pool":2,"snaps":[{"begin":2,"length":1}]}],"new_removed_snaps":[{"pool":2,"snaps":[{"begin":2,"length":1}]}],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-24T16:53:04.426 INFO:tasks.ceph.ceph_manager.ceph:all up! 2026-03-24T16:53:04.427 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd dump --format=json 2026-03-24T16:53:04.594 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:53:04.594 INFO:teuthology.orchestra.run.vm01.stdout:{"epoch":15,"fsid":"dc403d64-7ddd-4e06-90c8-8f9c41489fa2","created":"2026-03-24T16:52:51.402550+0000","modified":"2026-03-24T16:53:03.824710+0000","last_up_change":"2026-03-24T16:52:56.653061+0000","last_in_change":"2026-03-24T16:52:52.145302+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":4,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":2,"max_osd":3,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"tentacle","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-24T16:52:56.800817+0000","flags":1,"flags_names":"hashpspool","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandat
ory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"11","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_type":"Fair 
distribution","score_acting":2.9900000095367432,"score_stable":2.9900000095367432,"optimal_score":0.67000001668930054,"raw_score_acting":2,"raw_score_stable":2,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}},{"pool":2,"pool_name":"rbd","create_time":"2026-03-24T16:52:59.857305+0000","flags":8193,"flags_names":"hashpspool,selfmanaged_snaps","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":8,"pg_placement_num":8,"pg_placement_num_target":8,"pg_num_target":8,"pg_num_pending":8,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"15","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":2,"snap_epoch":15,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{},"application_metadata":{"rbd":{}},"read_balance":{"score_type":"Fair 
distribution","score_acting":1.8799999952316284,"score_stable":1.8799999952316284,"optimal_score":1,"raw_score_acting":1.8799999952316284,"raw_score_stable":1.8799999952316284,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"60e193c7-684f-428f-8239-e42abf940efd","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":12,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6816","nonce":1022245844},{"type":"v1","addr":"192.168.123.101:6817","nonce":1022245844}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6818","nonce":1022245844},{"type":"v1","addr":"192.168.123.101:6819","nonce":1022245844}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6822","nonce":1022245844},{"type":"v1","addr":"192.168.123.101:6823","nonce":1022245844}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6820","nonce":1022245844},{"type":"v1","addr":"192.168.123.101:6821","nonce":1022245844}]},"public_addr":"192.168.123.101:6817/1022245844","cluster_addr":"192.168.123.101:6819/1022245844","heartbeat_back_addr":"192.168.123.101:6823/1022245844","heartbeat_front_addr":"192.168.123.101:6821/1022245844","state":["exists","up"]},{"osd":1,"uuid":"3fdd8338-b04d-4ff4-a2f5-1de82f2b325c","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":12,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6800","nonce":4203820349},{"type":"v1","addr":"192.168.123.101:6801","nonce":4203820349}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6802","nonce":4203820349},{"type":"v1","addr":"192.168.123.101:6803","nonce":4203820349}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6806","nonce":4203820349},{"type":"v1","addr":"192.168.123.101:6807","nonce":420
3820349}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6804","nonce":4203820349},{"type":"v1","addr":"192.168.123.101:6805","nonce":4203820349}]},"public_addr":"192.168.123.101:6801/4203820349","cluster_addr":"192.168.123.101:6803/4203820349","heartbeat_back_addr":"192.168.123.101:6807/4203820349","heartbeat_front_addr":"192.168.123.101:6805/4203820349","state":["exists","up"]},{"osd":2,"uuid":"10385bfd-c8a3-402e-825a-5468ed42a5f9","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":12,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6808","nonce":831386055},{"type":"v1","addr":"192.168.123.101:6809","nonce":831386055}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6810","nonce":831386055},{"type":"v1","addr":"192.168.123.101:6811","nonce":831386055}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6814","nonce":831386055},{"type":"v1","addr":"192.168.123.101:6815","nonce":831386055}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6812","nonce":831386055},{"type":"v1","addr":"192.168.123.101:6813","nonce":831386055}]},"public_addr":"192.168.123.101:6809/831386055","cluster_addr":"192.168.123.101:6811/831386055","heartbeat_back_addr":"192.168.123.101:6815/831386055","heartbeat_front_addr":"192.168.123.101:6813/831386055","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0}],"pg_u
pmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"isa","technique":"reed_sol_van"}},"removed_snaps_queue":[{"pool":2,"snaps":[{"begin":2,"length":1}]}],"new_removed_snaps":[{"pool":2,"snaps":[{"begin":2,"length":1}]}],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-24T16:53:04.608 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph tell osd.0 flush_pg_stats 2026-03-24T16:53:04.608 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph tell osd.1 flush_pg_stats 2026-03-24T16:53:04.608 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph tell osd.2 flush_pg_stats 2026-03-24T16:53:04.710 INFO:teuthology.orchestra.run.vm01.stdout:34359738371 2026-03-24T16:53:04.710 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd last-stat-seq osd.0 2026-03-24T16:53:04.723 INFO:teuthology.orchestra.run.vm01.stdout:34359738371 2026-03-24T16:53:04.724 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd last-stat-seq osd.1 2026-03-24T16:53:04.727 INFO:teuthology.orchestra.run.vm01.stdout:34359738371 2026-03-24T16:53:04.727 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd last-stat-seq osd.2 
2026-03-24T16:53:04.897 INFO:teuthology.orchestra.run.vm01.stdout:34359738371 2026-03-24T16:53:04.912 INFO:tasks.ceph.ceph_manager.ceph:need seq 34359738371 got 34359738371 for osd.0 2026-03-24T16:53:04.912 DEBUG:teuthology.parallel:result is None 2026-03-24T16:53:04.949 INFO:teuthology.orchestra.run.vm01.stdout:34359738371 2026-03-24T16:53:04.963 INFO:tasks.ceph.ceph_manager.ceph:need seq 34359738371 got 34359738371 for osd.1 2026-03-24T16:53:04.963 DEBUG:teuthology.parallel:result is None 2026-03-24T16:53:04.972 INFO:teuthology.orchestra.run.vm01.stdout:34359738371 2026-03-24T16:53:04.986 INFO:tasks.ceph.ceph_manager.ceph:need seq 34359738371 got 34359738371 for osd.2 2026-03-24T16:53:04.986 DEBUG:teuthology.parallel:result is None 2026-03-24T16:53:04.986 INFO:tasks.ceph.ceph_manager.ceph:waiting for clean 2026-03-24T16:53:04.986 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json 2026-03-24T16:53:05.213 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:53:05.214 INFO:teuthology.orchestra.run.vm01.stderr:dumped all 2026-03-24T16:53:05.228 
INFO:teuthology.orchestra.run.vm01.stdout:{"pg_ready":true,"pg_map":{"version":16,"stamp":"2026-03-24T16:53:04.786143+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459299,"num_objects":4,"num_object_clones":0,"num_object_copies":8,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":4,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":59,"num_write_kb":586,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":35,"ondisk_log_size":35,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":13,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":81384,"kb_used_data":848,"kb_used_omap":24,"kb_used_meta":80423,"kb_avail":283034136,"statfs":{"total":289910292480,"available":289826955264,"internally_reserved":0,"allocated":868352,"data_stored":1025927,"data_compressed":5428,"data_compressed_allocated":442368,"data_compressed_original":884736,"omap_allocated":25013,"internal_metadata":82353739},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_sta
t":{"commit_latency_ms":2,"apply_latency_ms":2,"commit_latency_ns":2000000,"apply_latency_ns":2000000},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"2.969270"},"pg_stats":[{"pgid":"2.7","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T16:53:03.831237+0000","last_change":"2026-03-24T16:53:03.831340+0000","last_active":"2026-03-24T16:53:03.831237+0000","last_peered":"2026-03-24T16:53:03.831237+0000","last_clean":"2026-03-24T16:53:03.831237+0000","last_became_active":"2026-03-24T16:53:01.825454+0000","last_became_peered":"2026-03-24T16:53:01.825454+0000","last_unstale":"2026-03-24T16:53:03.831237+0000","last_undegraded":"2026-03-24T16:53:03.831237+0000","last_fullsized":"2026-03-24T16:53:03.831237+0000","mapping_epoch":12,"log_start":"0'0","on
disk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T22:35:11.514995+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00037031199999999997,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.6","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T16:53:03.831251+0000","last_change":"2026-03-24
T16:53:03.831334+0000","last_active":"2026-03-24T16:53:03.831251+0000","last_peered":"2026-03-24T16:53:03.831251+0000","last_clean":"2026-03-24T16:53:03.831251+0000","last_became_active":"2026-03-24T16:53:01.825605+0000","last_became_peered":"2026-03-24T16:53:01.825605+0000","last_unstale":"2026-03-24T16:53:03.831251+0000","last_undegraded":"2026-03-24T16:53:03.831251+0000","last_fullsized":"2026-03-24T16:53:03.831251+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T20:12:34.665045+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00033228199999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.5","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T16:53:03.831381+0000","last_change":"2026-03-24T16:53:03.831438+0000","last_active":"2026-03-24T16:53:03.831381+0000","last_peered":"2026-03-24T16:53:03.831381+0000","last_clean":"2026-03-24T16:53:03.831381+0000","last_became_active":"2026-03-24T16:53:01.825817+0000","last_became_peered":"2026-03-24T16:53:01.825817+0000","last_unstale":"2026-03-24T16:53:03.831381+0000","last_undegraded":"2026-03-24T16:53:03.831381+0000","last_fullsized":"2026-03-24T16:53:03.831381+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_
deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T17:34:15.325256+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00036637600000000002,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.4","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T16:53:03.831409+0000","last_change":"2026-03-24T16:53:03.831463+0000","last_active":"2026-03-24T16:53:03.831409+0000","last_peered":"2026-03-24T16:53:03.831409+0000","last_clean":"2026-03-24T16:53:03.831409+0000","last_became_active":"2026-03-
24T16:53:01.823211+0000","last_became_peered":"2026-03-24T16:53:01.823211+0000","last_unstale":"2026-03-24T16:53:03.831409+0000","last_undegraded":"2026-03-24T16:53:03.831409+0000","last_fullsized":"2026-03-24T16:53:03.831409+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T17:31:33.351783+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00036142599999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_locati
on_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.2","version":"15'2","reported_seq":22,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T16:53:03.832900+0000","last_change":"2026-03-24T16:53:03.832900+0000","last_active":"2026-03-24T16:53:03.832900+0000","last_peered":"2026-03-24T16:53:03.832900+0000","last_clean":"2026-03-24T16:53:03.832900+0000","last_became_active":"2026-03-24T16:53:01.823342+0000","last_became_peered":"2026-03-24T16:53:01.823342+0000","last_unstale":"2026-03-24T16:53:03.832900+0000","last_undegraded":"2026-03-24T16:53:03.832900+0000","last_fullsized":"2026-03-24T16:53:03.832900+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":2,"log_dups_size":0,"ondisk_log_size":2,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T21:09:41.046703+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00063821400000000001,"stat_sum":{"num_bytes":19,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.1","version":"0'0","reported_seq":18,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T16:53:03.879057+0000","last_change":"2026-03-24T16:53:03.879286+0000","last_active":"2026-03-24T16:53:03.879057+0000","last_peered":"2026-03-24T16:53:03.879057+0000","last_clean":"2026-03-24T16:53:03.879057+0000","last_became_active":"2026-03-24T16:53:01.823089+0000","last_became_peered":"2026-03-24T16:53:01.823089+0000","last_unstale":"2026-03-24T16:53:03.879057+0000","last_undegraded":"2026-03-24T16:53:03.879057+0000","last_fullsized":"2026-03-24T16:53:03.879057+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last
_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T04:34:11.146151+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.000414146,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.0","version":"0'0","reported_seq":18,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T16:53:03.879073+0000","last_change":"2026-03-24T16:53:03.879157+0000","last_active":"2026-03-24T16:53:03.879073+0000","last_peered":"2026-03-24T16:53:03.879073+0000","last_clean":"2026-03-24T16:53:03.879073+0000","last_became_active":"2026-03-24T16:53:0
1.822601+0000","last_became_peered":"2026-03-24T16:53:01.822601+0000","last_unstale":"2026-03-24T16:53:03.879073+0000","last_undegraded":"2026-03-24T16:53:03.879073+0000","last_fullsized":"2026-03-24T16:53:03.879073+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T23:54:32.393056+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.000199784,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocke
d_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.3","version":"13'1","reported_seq":21,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T16:53:03.831483+0000","last_change":"2026-03-24T16:53:03.831539+0000","last_active":"2026-03-24T16:53:03.831483+0000","last_peered":"2026-03-24T16:53:03.831483+0000","last_clean":"2026-03-24T16:53:03.831483+0000","last_became_active":"2026-03-24T16:53:01.825199+0000","last_became_peered":"2026-03-24T16:53:01.825199+0000","last_unstale":"2026-03-24T16:53:03.831483+0000","last_undegraded":"2026-03-24T16:53:03.831483+0000","last_fullsized":"2026-03-24T16:53:03.831483+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":1,"log_dups_size":0,"ondisk_log_size":1,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T18:47:26.159901+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00039536900000000002,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"1.0","version":"10'32","reported_seq":65,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T16:53:03.831167+0000","last_change":"2026-03-24T16:52:58.801131+0000","last_active":"2026-03-24T16:53:03.831167+0000","last_peered":"2026-03-24T16:53:03.831167+0000","last_clean":"2026-03-24T16:53:03.831167+0000","last_became_active":"2026-03-24T16:52:58.801011+0000","last_became_peered":"2026-03-24T16:52:58.801011+0000","last_unstale":"2026-03-24T16:53:03.831167+0000","last_undegraded":"2026-03-24T16:53:03.831167+0000","last_fullsized":"2026-03-24T16:53:03.831167+0000","mapping_epoch":9,"log_start":"0'0","ondisk_log_start":"0'0","created":9,"last_epoch_clean":10,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:52:57.792644+0000","last_deep_scrub":"0'0","last_
deep_scrub_stamp":"2026-03-24T16:52:57.792644+0000","last_clean_scrub_stamp":"2026-03-24T16:52:57.792644+0000","objects_scrubbed":0,"log_size":32,"log_dups_size":0,"ondisk_log_size":32,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T17:11:40.022667+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}],"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_s
crub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":38,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":3,"ondisk_log_size":3,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":483328,"data_stored":918560,"data_compressed":5428,"data_compressed_allocated":442368,"data_compressed_original":884736,"omap_allocated":0,"internal_metadata":0},"log_size":32,"ondi
sk_log_size":32,"up":2,"acting":2,"num_store_stats":2}],"osd_stats":[{"osd":2,"up_from":8,"seq":34359738371,"num_pgs":3,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":26960,"kb_used_data":112,"kb_used_omap":7,"kb_used_meta":26808,"kb_avail":94344880,"statfs":{"total":96636764160,"available":96609157120,"internally_reserved":0,"allocated":114688,"data_stored":31091,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":7471,"internal_metadata":27452113},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]},{"osd":1,"up_from":8,"seq":34359738371,"num_pgs":9,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":27216,"kb_used_data":368,"kb_used_omap":8,"kb_used_meta":26807,"kb_avail":94344624,"statfs":{"total":96636764160,"available":96608894976,"internally_reserved":0,"allocated":376832,"data_stored":497418,"data_compressed":2714,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":8771,"internal_metadata":27450813},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":1,"apply_latency_ms":1,"commit_latency_ns":1000000,"apply_latency_ns":1000000},"alerts":[]},{"osd":0,"up_from":8,"seq":34359738371,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":27208,"kb_used_data":368,"kb_used_omap":8,"kb_used_meta":26807,"kb_avail":94344632,"statfs":{"total":96636764160,"available":96608903168,"internally_reserved":0,"allocated":376832,"data_stored":497418,"data_compressed":2714,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":8771,"internal_m
etadata":27450813},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":1,"apply_latency_ms":1,"commit_latency_ns":1000000,"apply_latency_ns":1000000},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":2714,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":2714,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0}]}} 2026-03-24T16:53:05.228 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json 2026-03-24T16:53:05.403 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:53:05.403 INFO:teuthology.orchestra.run.vm01.stderr:dumped all 2026-03-24T16:53:05.418 
INFO:teuthology.orchestra.run.vm01.stdout:{"pg_ready":true,"pg_map":{"version":16,"stamp":"2026-03-24T16:53:04.786143+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459299,"num_objects":4,"num_object_clones":0,"num_object_copies":8,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":4,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":59,"num_write_kb":586,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":35,"ondisk_log_size":35,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":13,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":81384,"kb_used_data":848,"kb_used_omap":24,"kb_used_meta":80423,"kb_avail":283034136,"statfs":{"total":289910292480,"available":289826955264,"internally_reserved":0,"allocated":868352,"data_stored":1025927,"data_compressed":5428,"data_compressed_allocated":442368,"data_compressed_original":884736,"omap_allocated":25013,"internal_metadata":82353739},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_sta
t":{"commit_latency_ms":2,"apply_latency_ms":2,"commit_latency_ns":2000000,"apply_latency_ns":2000000},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"2.969270"},"pg_stats":[{"pgid":"2.7","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T16:53:03.831237+0000","last_change":"2026-03-24T16:53:03.831340+0000","last_active":"2026-03-24T16:53:03.831237+0000","last_peered":"2026-03-24T16:53:03.831237+0000","last_clean":"2026-03-24T16:53:03.831237+0000","last_became_active":"2026-03-24T16:53:01.825454+0000","last_became_peered":"2026-03-24T16:53:01.825454+0000","last_unstale":"2026-03-24T16:53:03.831237+0000","last_undegraded":"2026-03-24T16:53:03.831237+0000","last_fullsized":"2026-03-24T16:53:03.831237+0000","mapping_epoch":12,"log_start":"0'0","on
disk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T22:35:11.514995+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00037031199999999997,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.6","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T16:53:03.831251+0000","last_change":"2026-03-24
T16:53:03.831334+0000","last_active":"2026-03-24T16:53:03.831251+0000","last_peered":"2026-03-24T16:53:03.831251+0000","last_clean":"2026-03-24T16:53:03.831251+0000","last_became_active":"2026-03-24T16:53:01.825605+0000","last_became_peered":"2026-03-24T16:53:01.825605+0000","last_unstale":"2026-03-24T16:53:03.831251+0000","last_undegraded":"2026-03-24T16:53:03.831251+0000","last_fullsized":"2026-03-24T16:53:03.831251+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T20:12:34.665045+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00033228199999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.5","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T16:53:03.831381+0000","last_change":"2026-03-24T16:53:03.831438+0000","last_active":"2026-03-24T16:53:03.831381+0000","last_peered":"2026-03-24T16:53:03.831381+0000","last_clean":"2026-03-24T16:53:03.831381+0000","last_became_active":"2026-03-24T16:53:01.825817+0000","last_became_peered":"2026-03-24T16:53:01.825817+0000","last_unstale":"2026-03-24T16:53:03.831381+0000","last_undegraded":"2026-03-24T16:53:03.831381+0000","last_fullsized":"2026-03-24T16:53:03.831381+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_
deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T17:34:15.325256+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00036637600000000002,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.4","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T16:53:03.831409+0000","last_change":"2026-03-24T16:53:03.831463+0000","last_active":"2026-03-24T16:53:03.831409+0000","last_peered":"2026-03-24T16:53:03.831409+0000","last_clean":"2026-03-24T16:53:03.831409+0000","last_became_active":"2026-03-
24T16:53:01.823211+0000","last_became_peered":"2026-03-24T16:53:01.823211+0000","last_unstale":"2026-03-24T16:53:03.831409+0000","last_undegraded":"2026-03-24T16:53:03.831409+0000","last_fullsized":"2026-03-24T16:53:03.831409+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T17:31:33.351783+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00036142599999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_locati
on_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.2","version":"15'2","reported_seq":22,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T16:53:03.832900+0000","last_change":"2026-03-24T16:53:03.832900+0000","last_active":"2026-03-24T16:53:03.832900+0000","last_peered":"2026-03-24T16:53:03.832900+0000","last_clean":"2026-03-24T16:53:03.832900+0000","last_became_active":"2026-03-24T16:53:01.823342+0000","last_became_peered":"2026-03-24T16:53:01.823342+0000","last_unstale":"2026-03-24T16:53:03.832900+0000","last_undegraded":"2026-03-24T16:53:03.832900+0000","last_fullsized":"2026-03-24T16:53:03.832900+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":2,"log_dups_size":0,"ondisk_log_size":2,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T21:09:41.046703+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00063821400000000001,"stat_sum":{"num_bytes":19,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.1","version":"0'0","reported_seq":18,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T16:53:03.879057+0000","last_change":"2026-03-24T16:53:03.879286+0000","last_active":"2026-03-24T16:53:03.879057+0000","last_peered":"2026-03-24T16:53:03.879057+0000","last_clean":"2026-03-24T16:53:03.879057+0000","last_became_active":"2026-03-24T16:53:01.823089+0000","last_became_peered":"2026-03-24T16:53:01.823089+0000","last_unstale":"2026-03-24T16:53:03.879057+0000","last_undegraded":"2026-03-24T16:53:03.879057+0000","last_fullsized":"2026-03-24T16:53:03.879057+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last
_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T04:34:11.146151+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.000414146,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.0","version":"0'0","reported_seq":18,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T16:53:03.879073+0000","last_change":"2026-03-24T16:53:03.879157+0000","last_active":"2026-03-24T16:53:03.879073+0000","last_peered":"2026-03-24T16:53:03.879073+0000","last_clean":"2026-03-24T16:53:03.879073+0000","last_became_active":"2026-03-24T16:53:0
1.822601+0000","last_became_peered":"2026-03-24T16:53:01.822601+0000","last_unstale":"2026-03-24T16:53:03.879073+0000","last_undegraded":"2026-03-24T16:53:03.879073+0000","last_fullsized":"2026-03-24T16:53:03.879073+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T23:54:32.393056+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.000199784,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocke
d_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.3","version":"13'1","reported_seq":21,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T16:53:03.831483+0000","last_change":"2026-03-24T16:53:03.831539+0000","last_active":"2026-03-24T16:53:03.831483+0000","last_peered":"2026-03-24T16:53:03.831483+0000","last_clean":"2026-03-24T16:53:03.831483+0000","last_became_active":"2026-03-24T16:53:01.825199+0000","last_became_peered":"2026-03-24T16:53:01.825199+0000","last_unstale":"2026-03-24T16:53:03.831483+0000","last_undegraded":"2026-03-24T16:53:03.831483+0000","last_fullsized":"2026-03-24T16:53:03.831483+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":1,"log_dups_size":0,"ondisk_log_size":1,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T18:47:26.159901+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00039536900000000002,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"1.0","version":"10'32","reported_seq":65,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T16:53:03.831167+0000","last_change":"2026-03-24T16:52:58.801131+0000","last_active":"2026-03-24T16:53:03.831167+0000","last_peered":"2026-03-24T16:53:03.831167+0000","last_clean":"2026-03-24T16:53:03.831167+0000","last_became_active":"2026-03-24T16:52:58.801011+0000","last_became_peered":"2026-03-24T16:52:58.801011+0000","last_unstale":"2026-03-24T16:53:03.831167+0000","last_undegraded":"2026-03-24T16:53:03.831167+0000","last_fullsized":"2026-03-24T16:53:03.831167+0000","mapping_epoch":9,"log_start":"0'0","ondisk_log_start":"0'0","created":9,"last_epoch_clean":10,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:52:57.792644+0000","last_deep_scrub":"0'0","last_
deep_scrub_stamp":"2026-03-24T16:52:57.792644+0000","last_clean_scrub_stamp":"2026-03-24T16:52:57.792644+0000","objects_scrubbed":0,"log_size":32,"log_dups_size":0,"ondisk_log_size":32,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T17:11:40.022667+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}],"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_s
crub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":38,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":3,"ondisk_log_size":3,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":483328,"data_stored":918560,"data_compressed":5428,"data_compressed_allocated":442368,"data_compressed_original":884736,"omap_allocated":0,"internal_metadata":0},"log_size":32,"ondi
sk_log_size":32,"up":2,"acting":2,"num_store_stats":2}],"osd_stats":[{"osd":2,"up_from":8,"seq":34359738371,"num_pgs":3,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":26960,"kb_used_data":112,"kb_used_omap":7,"kb_used_meta":26808,"kb_avail":94344880,"statfs":{"total":96636764160,"available":96609157120,"internally_reserved":0,"allocated":114688,"data_stored":31091,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":7471,"internal_metadata":27452113},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]},{"osd":1,"up_from":8,"seq":34359738371,"num_pgs":9,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":27216,"kb_used_data":368,"kb_used_omap":8,"kb_used_meta":26807,"kb_avail":94344624,"statfs":{"total":96636764160,"available":96608894976,"internally_reserved":0,"allocated":376832,"data_stored":497418,"data_compressed":2714,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":8771,"internal_metadata":27450813},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":1,"apply_latency_ms":1,"commit_latency_ns":1000000,"apply_latency_ns":1000000},"alerts":[]},{"osd":0,"up_from":8,"seq":34359738371,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":27208,"kb_used_data":368,"kb_used_omap":8,"kb_used_meta":26807,"kb_avail":94344632,"statfs":{"total":96636764160,"available":96608903168,"internally_reserved":0,"allocated":376832,"data_stored":497418,"data_compressed":2714,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":8771,"internal_m
etadata":27450813},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":1,"apply_latency_ms":1,"commit_latency_ns":1000000,"apply_latency_ns":1000000},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":2714,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":2714,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0}]}} 2026-03-24T16:53:05.418 INFO:tasks.ceph.ceph_manager.ceph:clean! 2026-03-24T16:53:05.418 INFO:tasks.ceph:Waiting until ceph cluster ceph is healthy... 
2026-03-24T16:53:05.418 INFO:tasks.ceph.ceph_manager.ceph:wait_until_healthy 2026-03-24T16:53:05.418 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph health --format=json 2026-03-24T16:53:05.613 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T16:53:05.613 INFO:teuthology.orchestra.run.vm01.stdout:{"status":"HEALTH_OK","checks":{},"mutes":[]} 2026-03-24T16:53:05.631 INFO:tasks.ceph.ceph_manager.ceph:wait_until_healthy done 2026-03-24T16:53:05.631 INFO:teuthology.run_tasks:Running task workunit... 2026-03-24T16:53:05.635 INFO:tasks.workunit:Pulling workunits from ref 0392f78529848ec72469e8e431875cb98d3a5fb4 2026-03-24T16:53:05.635 INFO:tasks.workunit:Making a separate scratch dir for every client... 2026-03-24T16:53:05.635 DEBUG:teuthology.orchestra.run.vm01:> stat -- /home/ubuntu/cephtest/mnt.0 2026-03-24T16:53:05.639 INFO:teuthology.orchestra.run.vm01.stderr:stat: cannot statx '/home/ubuntu/cephtest/mnt.0': No such file or directory 2026-03-24T16:53:05.640 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-24T16:53:05.640 DEBUG:teuthology.orchestra.run.vm01:> mkdir -- /home/ubuntu/cephtest/mnt.0 2026-03-24T16:53:05.685 INFO:tasks.workunit:Created dir /home/ubuntu/cephtest/mnt.0 2026-03-24T16:53:05.685 DEBUG:teuthology.orchestra.run.vm01:> cd -- /home/ubuntu/cephtest/mnt.0 && mkdir -- client.0 2026-03-24T16:53:05.729 INFO:tasks.workunit:timeout=3h 2026-03-24T16:53:05.729 INFO:tasks.workunit:cleanup=True 2026-03-24T16:53:05.729 DEBUG:teuthology.orchestra.run.vm01:> rm -rf /home/ubuntu/cephtest/clone.client.0 && git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.0 && cd /home/ubuntu/cephtest/clone.client.0 && git checkout 0392f78529848ec72469e8e431875cb98d3a5fb4 2026-03-24T16:53:05.774 INFO:tasks.workunit.client.0.vm01.stderr:Cloning into '/home/ubuntu/cephtest/clone.client.0'... 
2026-03-24T16:53:49.569 INFO:tasks.workunit.client.0.vm01.stderr:Updating files: 99% (13867/13988) Updating files: 100% (13988/13988) Updating files: 100% (13988/13988), done. 2026-03-24T16:53:50.322 INFO:tasks.workunit.client.0.vm01.stderr:Note: switching to '0392f78529848ec72469e8e431875cb98d3a5fb4'. 2026-03-24T16:53:50.322 INFO:tasks.workunit.client.0.vm01.stderr: 2026-03-24T16:53:50.323 INFO:tasks.workunit.client.0.vm01.stderr:You are in 'detached HEAD' state. You can look around, make experimental 2026-03-24T16:53:50.323 INFO:tasks.workunit.client.0.vm01.stderr:changes and commit them, and you can discard any commits you make in this 2026-03-24T16:53:50.323 INFO:tasks.workunit.client.0.vm01.stderr:state without impacting any branches by switching back to a branch. 2026-03-24T16:53:50.323 INFO:tasks.workunit.client.0.vm01.stderr: 2026-03-24T16:53:50.323 INFO:tasks.workunit.client.0.vm01.stderr:If you want to create a new branch to retain commits you create, you may 2026-03-24T16:53:50.323 INFO:tasks.workunit.client.0.vm01.stderr:do so (now or later) by using -c with the switch command. 
Example: 2026-03-24T16:53:50.323 INFO:tasks.workunit.client.0.vm01.stderr: 2026-03-24T16:53:50.323 INFO:tasks.workunit.client.0.vm01.stderr: git switch -c 2026-03-24T16:53:50.323 INFO:tasks.workunit.client.0.vm01.stderr: 2026-03-24T16:53:50.323 INFO:tasks.workunit.client.0.vm01.stderr:Or undo this operation with: 2026-03-24T16:53:50.323 INFO:tasks.workunit.client.0.vm01.stderr: 2026-03-24T16:53:50.323 INFO:tasks.workunit.client.0.vm01.stderr: git switch - 2026-03-24T16:53:50.323 INFO:tasks.workunit.client.0.vm01.stderr: 2026-03-24T16:53:50.323 INFO:tasks.workunit.client.0.vm01.stderr:Turn off this advice by setting config variable advice.detachedHead to false 2026-03-24T16:53:50.323 INFO:tasks.workunit.client.0.vm01.stderr: 2026-03-24T16:53:50.323 INFO:tasks.workunit.client.0.vm01.stderr:HEAD is now at 0392f785298 qa/tasks/keystone: restart mariadb for rocky and alma linux too 2026-03-24T16:53:50.331 DEBUG:teuthology.orchestra.run.vm01:> cd -- /home/ubuntu/cephtest/clone.client.0/qa/workunits && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.0 2026-03-24T16:53:50.378 INFO:tasks.workunit.client.0.vm01.stdout:for d in direct_io fs ; do ( cd $d ; make all ) ; done 2026-03-24T16:53:50.379 INFO:tasks.workunit.client.0.vm01.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io' 2026-03-24T16:53:50.380 INFO:tasks.workunit.client.0.vm01.stdout:cc -Wall -Wextra -D_GNU_SOURCE direct_io_test.c -o direct_io_test 2026-03-24T16:53:50.424 INFO:tasks.workunit.client.0.vm01.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_sync_io.c -o test_sync_io 2026-03-24T16:53:50.459 INFO:tasks.workunit.client.0.vm01.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_short_dio_read.c -o test_short_dio_read 2026-03-24T16:53:50.488 INFO:tasks.workunit.client.0.vm01.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io' 2026-03-24T16:53:50.489 
INFO:tasks.workunit.client.0.vm01.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs' 2026-03-24T16:53:50.489 INFO:tasks.workunit.client.0.vm01.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_o_trunc.c -o test_o_trunc 2026-03-24T16:53:50.517 INFO:tasks.workunit.client.0.vm01.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs' 2026-03-24T16:53:50.520 DEBUG:teuthology.orchestra.run.vm01:> set -ex 2026-03-24T16:53:50.520 DEBUG:teuthology.orchestra.run.vm01:> dd if=/home/ubuntu/cephtest/workunits.list.client.0 of=/dev/stdout 2026-03-24T16:53:50.565 INFO:tasks.workunit:Running workunits matching rbd/cli_generic.sh on client.0... 2026-03-24T16:53:50.566 INFO:tasks.workunit:Running workunit rbd/cli_generic.sh... 2026-03-24T16:53:50.566 DEBUG:teuthology.orchestra.run.vm01:workunit test rbd/cli_generic.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=0392f78529848ec72469e8e431875cb98d3a5fb4 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/rbd/cli_generic.sh 2026-03-24T16:53:50.613 INFO:tasks.workunit.client.0.vm01.stderr:+ export RBD_FORCE_ALLOW_V1=1 2026-03-24T16:53:50.613 INFO:tasks.workunit.client.0.vm01.stderr:+ RBD_FORCE_ALLOW_V1=1 2026-03-24T16:53:50.613 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls 2026-03-24T16:53:50.613 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T16:53:50.613 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -v '^0$' 2026-03-24T16:53:50.641 INFO:tasks.workunit.client.0.vm01.stderr:+ IMGS='testimg1 testimg2 testimg3 testimg4 testimg5 testimg6 testimg-diff1 testimg-diff2 
testimg-diff3 foo foo2 bar bar2 test1 test2 test3 test4 clone2' 2026-03-24T16:53:50.641 INFO:tasks.workunit.client.0.vm01.stderr:+ tiered=0 2026-03-24T16:53:50.641 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd dump 2026-03-24T16:53:50.641 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^pool' 2026-03-24T16:53:50.641 INFO:tasks.workunit.client.0.vm01.stderr:+ grep ''\''rbd'\''' 2026-03-24T16:53:50.641 INFO:tasks.workunit.client.0.vm01.stderr:+ grep tier 2026-03-24T16:53:50.893 INFO:tasks.workunit.client.0.vm01.stderr:+ test_pool_image_args 2026-03-24T16:53:50.893 INFO:tasks.workunit.client.0.vm01.stdout:testing pool and image args... 2026-03-24T16:53:50.893 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'testing pool and image args...' 2026-03-24T16:53:50.893 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images 2026-03-24T16:53:50.893 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:53:50.961 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:53:51.028 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:53:51.093 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:53:51.156 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:53:51.225 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:53:51.491 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:53:51.554 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:53:51.620 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:53:51.683 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:53:51.745 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:53:51.809 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:53:51.877 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:53:51.942 INFO:tasks.workunit.client.0.vm01.stderr:+ for 
img in $IMGS 2026-03-24T16:53:52.004 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:53:52.068 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:53:52.131 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:53:52.196 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:53:52.257 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd pool delete test test --yes-i-really-really-mean-it 2026-03-24T16:53:52.482 INFO:tasks.workunit.client.0.vm01.stderr:pool 'test' does not exist 2026-03-24T16:53:52.496 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd pool create test 32 2026-03-24T16:53:52.946 INFO:tasks.workunit.client.0.vm01.stderr:pool 'test' already exists 2026-03-24T16:53:52.960 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd pool init test 2026-03-24T16:53:55.920 INFO:tasks.workunit.client.0.vm01.stderr:+ truncate -s 1 /tmp/empty /tmp/empty@snap 2026-03-24T16:53:55.922 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls 2026-03-24T16:53:55.922 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T16:53:55.922 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 0 2026-03-24T16:53:55.949 INFO:tasks.workunit.client.0.vm01.stdout:0 2026-03-24T16:53:55.949 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create -s 1 test1 2026-03-24T16:53:55.979 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:53:55.977+0000 7fbd9a803200 -1 librbd: Forced V1 image creation. 2026-03-24T16:53:58.259 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls 2026-03-24T16:53:58.260 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -q test1 2026-03-24T16:53:58.283 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd import --image test2 /tmp/empty 2026-03-24T16:53:58.307 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:53:58.305+0000 7fa2ade1b200 -1 librbd: Forced V1 image creation. 2026-03-24T16:53:58.316 INFO:tasks.workunit.client.0.vm01.stderr: Importing image: 100% complete...done. 
2026-03-24T16:53:58.316 INFO:tasks.workunit.client.0.vm01.stderr:rbd: --image is deprecated, use --dest 2026-03-24T16:53:58.320 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls 2026-03-24T16:53:58.320 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -q test2 2026-03-24T16:53:58.347 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd --dest test3 import /tmp/empty 2026-03-24T16:53:58.369 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:53:58.365+0000 7f5791260200 -1 librbd: Forced V1 image creation. 2026-03-24T16:53:58.376 INFO:tasks.workunit.client.0.vm01.stderr: Importing image: 100% complete...done. 2026-03-24T16:53:58.380 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls 2026-03-24T16:53:58.380 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -q test3 2026-03-24T16:53:58.407 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd import /tmp/empty foo 2026-03-24T16:53:58.430 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:53:58.429+0000 7f35ef8ff200 -1 librbd: Forced V1 image creation. 2026-03-24T16:53:58.437 INFO:tasks.workunit.client.0.vm01.stderr: Importing image: 100% complete...done. 
2026-03-24T16:53:58.441 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls 2026-03-24T16:53:58.441 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -q foo 2026-03-24T16:53:58.468 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd import --dest test/empty@snap /tmp/empty 2026-03-24T16:53:58.484 INFO:tasks.workunit.client.0.vm01.stderr:rbd: destination snapshot name specified for a command that doesn't use it 2026-03-24T16:53:58.486 INFO:tasks.workunit.client.0.vm01.stderr:+ true 2026-03-24T16:53:58.486 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd import /tmp/empty test/empty@snap 2026-03-24T16:53:58.500 INFO:tasks.workunit.client.0.vm01.stderr:rbd: destination snapshot name specified for a command that doesn't use it 2026-03-24T16:53:58.501 INFO:tasks.workunit.client.0.vm01.stderr:+ true 2026-03-24T16:53:58.501 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd import --image test/empty@snap /tmp/empty 2026-03-24T16:53:58.516 INFO:tasks.workunit.client.0.vm01.stderr:rbd: destination snapshot name specified for a command that doesn't use it 2026-03-24T16:53:58.516 INFO:tasks.workunit.client.0.vm01.stderr:rbd: --image is deprecated, use --dest 2026-03-24T16:53:58.517 INFO:tasks.workunit.client.0.vm01.stderr:+ true 2026-03-24T16:53:58.517 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd import /tmp/empty@snap 2026-03-24T16:53:58.530 INFO:tasks.workunit.client.0.vm01.stderr:rbd: destination snapshot name specified for a command that doesn't use it 2026-03-24T16:53:58.531 INFO:tasks.workunit.client.0.vm01.stderr:+ true 2026-03-24T16:53:58.532 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls test 2026-03-24T16:53:58.532 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T16:53:58.532 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 0 2026-03-24T16:53:58.559 INFO:tasks.workunit.client.0.vm01.stdout:0 2026-03-24T16:53:58.559 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd import /tmp/empty test/test1 2026-03-24T16:53:58.581 
INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:53:58.577+0000 7f3b86ca6200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:00.275 INFO:tasks.workunit.client.0.vm01.stderr: Importing image: 100% complete...done. 2026-03-24T16:54:00.279 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls test 2026-03-24T16:54:00.279 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -q test1 2026-03-24T16:54:00.305 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd -p test import /tmp/empty test2 2026-03-24T16:54:00.329 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:00.325+0000 7f1ac828e200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:00.338 INFO:tasks.workunit.client.0.vm01.stderr: Importing image: 100% complete...done. 2026-03-24T16:54:00.338 INFO:tasks.workunit.client.0.vm01.stderr:rbd: -p [ --pool ] is deprecated, use --dest-pool 2026-03-24T16:54:00.342 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls test 2026-03-24T16:54:00.343 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -q test2 2026-03-24T16:54:00.367 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd --image test3 -p test import /tmp/empty 2026-03-24T16:54:00.390 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:00.389+0000 7f92f99d1200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:00.398 INFO:tasks.workunit.client.0.vm01.stderr: Importing image: 100% complete...done. 
2026-03-24T16:54:00.399 INFO:tasks.workunit.client.0.vm01.stderr:rbd: --image is deprecated, use --dest 2026-03-24T16:54:00.399 INFO:tasks.workunit.client.0.vm01.stderr:rbd: -p [ --pool ] is deprecated, use --dest-pool 2026-03-24T16:54:00.403 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls test 2026-03-24T16:54:00.403 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -q test3 2026-03-24T16:54:00.429 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd --image test4 -p test import /tmp/empty 2026-03-24T16:54:00.454 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:00.449+0000 7f8c19a25200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:00.463 INFO:tasks.workunit.client.0.vm01.stderr: Importing image: 100% complete...done. 2026-03-24T16:54:00.464 INFO:tasks.workunit.client.0.vm01.stderr:rbd: --image is deprecated, use --dest 2026-03-24T16:54:00.464 INFO:tasks.workunit.client.0.vm01.stderr:rbd: -p [ --pool ] is deprecated, use --dest-pool 2026-03-24T16:54:00.467 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls test 2026-03-24T16:54:00.467 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -q test4 2026-03-24T16:54:00.494 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd --dest test5 -p test import /tmp/empty 2026-03-24T16:54:00.520 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:00.517+0000 7f8b44a09200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:00.529 INFO:tasks.workunit.client.0.vm01.stderr: Importing image: 100% complete...done. 
2026-03-24T16:54:00.529 INFO:tasks.workunit.client.0.vm01.stderr:rbd: -p [ --pool ] is deprecated, use --dest-pool 2026-03-24T16:54:00.533 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls test 2026-03-24T16:54:00.533 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -q test5 2026-03-24T16:54:00.560 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd --dest test6 --dest-pool test import /tmp/empty 2026-03-24T16:54:00.586 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:00.585+0000 7f8238af0200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:00.595 INFO:tasks.workunit.client.0.vm01.stderr: Importing image: 100% complete...done. 2026-03-24T16:54:00.599 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls test 2026-03-24T16:54:00.599 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -q test6 2026-03-24T16:54:00.627 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd --image test7 --dest-pool test import /tmp/empty 2026-03-24T16:54:00.655 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:00.653+0000 7f14634cd200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:00.664 INFO:tasks.workunit.client.0.vm01.stderr: Importing image: 100% complete...done. 2026-03-24T16:54:00.665 INFO:tasks.workunit.client.0.vm01.stderr:rbd: --image is deprecated, use --dest 2026-03-24T16:54:00.668 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls test 2026-03-24T16:54:00.668 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -q test7 2026-03-24T16:54:00.696 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd --image test/test8 import /tmp/empty 2026-03-24T16:54:00.722 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:00.717+0000 7f59ea51e200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:00.730 INFO:tasks.workunit.client.0.vm01.stderr: Importing image: 100% complete...done. 
2026-03-24T16:54:00.731 INFO:tasks.workunit.client.0.vm01.stderr:rbd: --image is deprecated, use --dest 2026-03-24T16:54:00.734 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls test 2026-03-24T16:54:00.735 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -q test8 2026-03-24T16:54:00.762 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd --dest test/test9 import /tmp/empty 2026-03-24T16:54:00.785 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:00.781+0000 7f2799932200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:00.792 INFO:tasks.workunit.client.0.vm01.stderr: Importing image: 100% complete...done. 2026-03-24T16:54:00.796 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls test 2026-03-24T16:54:00.796 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -q test9 2026-03-24T16:54:00.823 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd import --pool test /tmp/empty 2026-03-24T16:54:00.847 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:00.845+0000 7f4eb41c9200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:00.856 INFO:tasks.workunit.client.0.vm01.stderr: Importing image: 100% complete...done. 2026-03-24T16:54:00.857 INFO:tasks.workunit.client.0.vm01.stderr:rbd: -p [ --pool ] is deprecated, use --dest-pool 2026-03-24T16:54:00.860 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls test 2026-03-24T16:54:00.860 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -q empty 2026-03-24T16:54:00.888 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd copy test/test9 test10 2026-03-24T16:54:00.928 INFO:tasks.workunit.client.0.vm01.stderr: Image copy: 100% complete... Image copy: 100% complete...done. 
2026-03-24T16:54:00.933 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls test 2026-03-24T16:54:00.933 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -qv test10 2026-03-24T16:54:00.959 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls 2026-03-24T16:54:00.959 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -q test10 2026-03-24T16:54:00.986 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd copy test/test9 test/test10 2026-03-24T16:54:01.025 INFO:tasks.workunit.client.0.vm01.stderr: Image copy: 100% complete... Image copy: 100% complete...done. 2026-03-24T16:54:01.029 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls test 2026-03-24T16:54:01.030 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -q test10 2026-03-24T16:54:01.056 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd copy --pool test test10 --dest-pool test test11 2026-03-24T16:54:01.099 INFO:tasks.workunit.client.0.vm01.stderr: Image copy: 100% complete... Image copy: 100% complete...done. 2026-03-24T16:54:01.104 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls test 2026-03-24T16:54:01.104 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -q test11 2026-03-24T16:54:01.132 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd copy --dest-pool rbd --pool test test11 test12 2026-03-24T16:54:01.174 INFO:tasks.workunit.client.0.vm01.stderr: Image copy: 100% complete... Image copy: 100% complete...done. 
2026-03-24T16:54:01.178 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls 2026-03-24T16:54:01.178 INFO:tasks.workunit.client.0.vm01.stderr:+ grep test12 2026-03-24T16:54:01.206 INFO:tasks.workunit.client.0.vm01.stdout:test12 2026-03-24T16:54:01.206 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls test 2026-03-24T16:54:01.206 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -qv test12 2026-03-24T16:54:01.234 INFO:tasks.workunit.client.0.vm01.stderr:+ rm -f /tmp/empty /tmp/empty@snap 2026-03-24T16:54:01.235 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd pool delete test test --yes-i-really-really-mean-it 2026-03-24T16:54:01.761 INFO:tasks.workunit.client.0.vm01.stderr:pool 'test' does not exist 2026-03-24T16:54:01.775 INFO:tasks.workunit.client.0.vm01.stderr:+ for f in foo test1 test10 test12 test2 test3 2026-03-24T16:54:01.775 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm foo 2026-03-24T16:54:01.807 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:01.810 INFO:tasks.workunit.client.0.vm01.stderr:+ for f in foo test1 test10 test12 test2 test3 2026-03-24T16:54:01.810 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test1 2026-03-24T16:54:01.840 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:01.842 INFO:tasks.workunit.client.0.vm01.stderr:+ for f in foo test1 test10 test12 test2 test3 2026-03-24T16:54:01.842 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test10 2026-03-24T16:54:01.893 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:01.896 INFO:tasks.workunit.client.0.vm01.stderr:+ for f in foo test1 test10 test12 test2 test3 2026-03-24T16:54:01.896 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test12 2026-03-24T16:54:01.948 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 
2026-03-24T16:54:01.952 INFO:tasks.workunit.client.0.vm01.stderr:+ for f in foo test1 test10 test12 test2 test3 2026-03-24T16:54:01.952 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test2 2026-03-24T16:54:01.983 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:01.986 INFO:tasks.workunit.client.0.vm01.stderr:+ for f in foo test1 test10 test12 test2 test3 2026-03-24T16:54:01.986 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test3 2026-03-24T16:54:02.016 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:02.019 INFO:tasks.workunit.client.0.vm01.stderr:+ test_rename 2026-03-24T16:54:02.019 INFO:tasks.workunit.client.0.vm01.stdout:testing rename... 2026-03-24T16:54:02.019 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'testing rename...' 2026-03-24T16:54:02.019 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images 2026-03-24T16:54:02.019 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:02.077 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:02.168 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:02.235 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:02.304 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:02.575 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:02.654 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:02.717 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:02.955 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:03.020 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:03.086 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:03.158 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:03.228 
INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:03.327 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:03.391 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:03.455 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:03.527 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:03.588 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:03.857 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 1 -s 1 foo 2026-03-24T16:54:03.877 INFO:tasks.workunit.client.0.vm01.stderr:rbd: image format 1 is deprecated 2026-03-24T16:54:03.886 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:03.885+0000 7f7a4ab04200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:03.895 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 -s 1 bar 2026-03-24T16:54:03.936 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rename foo foo2 2026-03-24T16:54:03.979 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rename foo2 bar 2026-03-24T16:54:03.979 INFO:tasks.workunit.client.0.vm01.stderr:+ grep exists 2026-03-24T16:54:04.018 INFO:tasks.workunit.client.0.vm01.stdout:2026-03-24T16:54:04.009+0000 7fdeb2dd6200 -1 librbd::Operations: rbd image bar already exists 2026-03-24T16:54:04.018 INFO:tasks.workunit.client.0.vm01.stdout:rbd: rename error: (17) File exists 2026-03-24T16:54:04.018 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rename bar bar2 2026-03-24T16:54:04.065 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rename bar2 foo2 2026-03-24T16:54:04.065 INFO:tasks.workunit.client.0.vm01.stderr:+ grep exists 2026-03-24T16:54:04.100 INFO:tasks.workunit.client.0.vm01.stdout:2026-03-24T16:54:04.093+0000 7f0f178f4200 -1 librbd::Operations: rbd image foo2 already exists 2026-03-24T16:54:04.100 INFO:tasks.workunit.client.0.vm01.stdout:rbd: rename error: (17) File exists 2026-03-24T16:54:04.101 
INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd pool create rbd2 8 2026-03-24T16:54:05.338 INFO:tasks.workunit.client.0.vm01.stderr:pool 'rbd2' already exists 2026-03-24T16:54:05.353 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd pool init rbd2 2026-03-24T16:54:07.706 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create -p rbd2 -s 1 foo 2026-03-24T16:54:07.734 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:07.733+0000 7f3817360200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:09.798 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rename rbd2/foo rbd2/bar 2026-03-24T16:54:09.836 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd -p rbd2 ls 2026-03-24T16:54:09.836 INFO:tasks.workunit.client.0.vm01.stderr:+ grep bar 2026-03-24T16:54:09.864 INFO:tasks.workunit.client.0.vm01.stdout:bar 2026-03-24T16:54:09.864 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rename rbd2/bar foo 2026-03-24T16:54:09.904 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rename --pool rbd2 foo bar 2026-03-24T16:54:09.945 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rename rbd2/bar --dest-pool rbd foo 2026-03-24T16:54:09.961 INFO:tasks.workunit.client.0.vm01.stderr:rbd: mv/rename across pools not supported 2026-03-24T16:54:09.961 INFO:tasks.workunit.client.0.vm01.stderr:source pool: rbd2 dest pool: rbd 2026-03-24T16:54:09.962 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rename --pool rbd2 bar --dest-pool rbd2 foo 2026-03-24T16:54:10.004 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd -p rbd2 ls 2026-03-24T16:54:10.004 INFO:tasks.workunit.client.0.vm01.stderr:+ grep foo 2026-03-24T16:54:10.033 INFO:tasks.workunit.client.0.vm01.stdout:foo 2026-03-24T16:54:10.033 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd pool rm rbd2 rbd2 --yes-i-really-really-mean-it 2026-03-24T16:54:10.850 INFO:tasks.workunit.client.0.vm01.stderr:pool 'rbd2' does not exist 2026-03-24T16:54:10.863 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images 2026-03-24T16:54:10.863 
INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:10.929 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:10.991 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:11.051 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:11.119 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:11.183 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:11.249 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:11.312 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:11.379 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:11.449 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:11.513 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:11.584 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:11.647 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:11.762 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:11.829 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:11.889 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:11.952 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:12.018 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T16:54:12.284 INFO:tasks.workunit.client.0.vm01.stderr:+ test_ls 2026-03-24T16:54:12.284 INFO:tasks.workunit.client.0.vm01.stdout:testing ls... 2026-03-24T16:54:12.285 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'testing ls...' 
2026-03-24T16:54:12.285 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images
2026-03-24T16:54:12.285 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:12.346 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:12.609 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:12.674 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:12.737 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:12.802 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:12.867 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:12.932 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:12.998 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:13.062 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:13.123 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:13.389 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:13.452 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:13.519 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:13.584 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:13.648 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:13.714 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:13.780 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:13.835 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 1 -s 1 test1
2026-03-24T16:54:13.851 INFO:tasks.workunit.client.0.vm01.stderr:rbd: image format 1 is deprecated
2026-03-24T16:54:13.857 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:13.853+0000 7ffae00f5200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:13.863 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 1 -s 1 test2
2026-03-24T16:54:13.877 INFO:tasks.workunit.client.0.vm01.stderr:rbd: image format 1 is deprecated
2026-03-24T16:54:13.884 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:13.881+0000 7f85b7ad2200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:13.889 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls
2026-03-24T16:54:13.889 INFO:tasks.workunit.client.0.vm01.stderr:+ grep test1
2026-03-24T16:54:13.912 INFO:tasks.workunit.client.0.vm01.stdout:test1
2026-03-24T16:54:13.912 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls
2026-03-24T16:54:13.912 INFO:tasks.workunit.client.0.vm01.stderr:+ grep test2
2026-03-24T16:54:13.933 INFO:tasks.workunit.client.0.vm01.stdout:test2
2026-03-24T16:54:13.933 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls
2026-03-24T16:54:13.933 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T16:54:13.934 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 2
2026-03-24T16:54:13.954 INFO:tasks.workunit.client.0.vm01.stdout:2
2026-03-24T16:54:13.955 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls -l
2026-03-24T16:54:13.955 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'test1.*1 MiB.*1'
2026-03-24T16:54:13.979 INFO:tasks.workunit.client.0.vm01.stdout:test1 1 MiB 1
2026-03-24T16:54:13.979 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls -l
2026-03-24T16:54:13.979 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'test2.*1 MiB.*1'
2026-03-24T16:54:14.003 INFO:tasks.workunit.client.0.vm01.stdout:test2 1 MiB 1
2026-03-24T16:54:14.003 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test1
2026-03-24T16:54:14.033 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:14.035 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test2
2026-03-24T16:54:14.065 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:14.069 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 -s 1 test1
2026-03-24T16:54:14.103 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 -s 1 test2
2026-03-24T16:54:14.138 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls
2026-03-24T16:54:14.138 INFO:tasks.workunit.client.0.vm01.stderr:+ grep test1
2026-03-24T16:54:14.162 INFO:tasks.workunit.client.0.vm01.stdout:test1
2026-03-24T16:54:14.162 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls
2026-03-24T16:54:14.162 INFO:tasks.workunit.client.0.vm01.stderr:+ grep test2
2026-03-24T16:54:14.185 INFO:tasks.workunit.client.0.vm01.stdout:test2
2026-03-24T16:54:14.185 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls
2026-03-24T16:54:14.186 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T16:54:14.186 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 2
2026-03-24T16:54:14.209 INFO:tasks.workunit.client.0.vm01.stdout:2
2026-03-24T16:54:14.209 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls -l
2026-03-24T16:54:14.209 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'test1.*1 MiB.*2'
2026-03-24T16:54:14.241 INFO:tasks.workunit.client.0.vm01.stdout:test1 1 MiB 2
2026-03-24T16:54:14.241 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls -l
2026-03-24T16:54:14.241 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'test2.*1 MiB.*2'
2026-03-24T16:54:14.273 INFO:tasks.workunit.client.0.vm01.stdout:test2 1 MiB 2
2026-03-24T16:54:14.273 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test1
2026-03-24T16:54:14.336 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:14.333+0000 7f2c2f243640 0 -- 192.168.123.101:0/2220185033 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55a8dfcfc150 msgr2=0x55a8dfcb83c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T16:54:14.338 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:14.342 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test2
2026-03-24T16:54:14.409 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:14.405+0000 7f7625788640 0 -- 192.168.123.101:0/679271901 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f7608008d30 msgr2=0x7f76080291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T16:54:14.415 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:14.419 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 -s 1 test1
2026-03-24T16:54:14.859 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 1 -s 1 test2
2026-03-24T16:54:14.875 INFO:tasks.workunit.client.0.vm01.stderr:rbd: image format 1 is deprecated
2026-03-24T16:54:14.892 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:14.877+0000 7fc5b3ce8200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:15.037 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls
2026-03-24T16:54:15.037 INFO:tasks.workunit.client.0.vm01.stderr:+ grep test1
2026-03-24T16:54:15.063 INFO:tasks.workunit.client.0.vm01.stdout:test1
2026-03-24T16:54:15.063 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls
2026-03-24T16:54:15.063 INFO:tasks.workunit.client.0.vm01.stderr:+ grep test2
2026-03-24T16:54:15.090 INFO:tasks.workunit.client.0.vm01.stdout:test2
2026-03-24T16:54:15.090 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls
2026-03-24T16:54:15.090 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T16:54:15.091 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 2
2026-03-24T16:54:15.114 INFO:tasks.workunit.client.0.vm01.stdout:2
2026-03-24T16:54:15.157 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls -l
2026-03-24T16:54:15.157 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'test1.*1 MiB.*2'
2026-03-24T16:54:15.157 INFO:tasks.workunit.client.0.vm01.stdout:test1 1 MiB 2
2026-03-24T16:54:15.157 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls -l
2026-03-24T16:54:15.157 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'test2.*1 MiB.*1'
2026-03-24T16:54:15.171 INFO:tasks.workunit.client.0.vm01.stdout:test2 1 MiB 1
2026-03-24T16:54:15.171 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images
2026-03-24T16:54:15.171 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:15.265 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:15.326 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:15.392 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:15.460 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:15.526 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:15.590 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:15.655 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:15.717 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:15.781 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:15.840 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:15.905 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:15.968 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:16.030 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:16.129 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:16.198 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:16.265 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:16.330 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T16:54:16.395 INFO:tasks.workunit.client.0.vm01.stderr:++ seq -w 00 99
2026-03-24T16:54:16.396 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:16.396 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.00 -s 1
2026-03-24T16:54:16.418 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:16.413+0000 7f8d4217e200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:16.424 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:16.424 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.01 -s 1
2026-03-24T16:54:16.446 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:16.445+0000 7f810ecf6200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:16.452 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:16.452 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.02 -s 1
2026-03-24T16:54:16.475 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:16.473+0000 7f43020d7200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:16.480 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:16.481 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.03 -s 1
2026-03-24T16:54:16.505 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:16.501+0000 7f0c21200200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:16.511 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:16.511 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.04 -s 1
2026-03-24T16:54:16.534 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:16.533+0000 7fe204e16200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:16.541 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:16.541 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.05 -s 1
2026-03-24T16:54:16.565 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:16.561+0000 7ffb5df84200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:16.572 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:16.572 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.06 -s 1
2026-03-24T16:54:16.594 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:16.589+0000 7fd7ef2a3200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:16.600 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:16.600 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.07 -s 1
2026-03-24T16:54:16.623 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:16.621+0000 7f70a0fe7200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:16.630 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:16.630 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.08 -s 1
2026-03-24T16:54:16.653 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:16.649+0000 7f64bcda3200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:16.660 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:16.660 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.09 -s 1
2026-03-24T16:54:16.683 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:16.681+0000 7fc34264a200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:16.691 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:16.691 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.10 -s 1
2026-03-24T16:54:16.714 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:16.713+0000 7f1e372d8200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:16.721 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:16.721 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.11 -s 1
2026-03-24T16:54:16.745 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:16.741+0000 7f529008d200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:16.752 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:16.753 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.12 -s 1
2026-03-24T16:54:16.776 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:16.773+0000 7f7d5d25d200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:16.783 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:16.783 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.13 -s 1
2026-03-24T16:54:16.808 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:16.805+0000 7f1e14783200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:16.815 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:16.815 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.14 -s 1
2026-03-24T16:54:16.838 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:16.837+0000 7f1c87084200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:16.845 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:16.845 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.15 -s 1
2026-03-24T16:54:16.869 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:16.865+0000 7fd518344200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:16.876 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:16.876 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.16 -s 1
2026-03-24T16:54:16.899 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:16.897+0000 7fd788739200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:16.905 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:16.905 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.17 -s 1
2026-03-24T16:54:16.928 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:16.925+0000 7fb0002e3200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:16.935 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:16.935 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.18 -s 1
2026-03-24T16:54:16.959 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:16.957+0000 7f98971ea200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:16.966 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:16.966 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.19 -s 1
2026-03-24T16:54:16.989 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:16.985+0000 7f3b59f8f200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:16.996 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:16.996 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.20 -s 1
2026-03-24T16:54:17.020 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:17.017+0000 7f5958ce2200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:17.027 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:17.027 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.21 -s 1
2026-03-24T16:54:17.052 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:17.049+0000 7f5c9652a200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:17.060 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:17.060 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.22 -s 1
2026-03-24T16:54:17.082 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:17.077+0000 7fe9829f2200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:17.087 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:17.087 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.23 -s 1
2026-03-24T16:54:17.109 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:17.105+0000 7f07266c7200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:17.116 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:17.116 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.24 -s 1
2026-03-24T16:54:17.142 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:17.141+0000 7f7b0b89e200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:17.148 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:17.148 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.25 -s 1
2026-03-24T16:54:17.171 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:17.169+0000 7fb3876de200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:17.178 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:17.178 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.26 -s 1
2026-03-24T16:54:17.198 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:17.197+0000 7f3910415200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:17.205 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:17.205 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.27 -s 1
2026-03-24T16:54:17.231 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:17.229+0000 7ff0266aa200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:17.238 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:17.238 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.28 -s 1
2026-03-24T16:54:17.265 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:17.261+0000 7f02fd198200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:17.272 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:17.272 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.29 -s 1
2026-03-24T16:54:17.297 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:17.293+0000 7f0514e68200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:17.304 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:17.304 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.30 -s 1
2026-03-24T16:54:17.527 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:17.525+0000 7fd323ef3200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:17.534 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:17.534 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.31 -s 1
2026-03-24T16:54:17.558 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:17.553+0000 7f16283ff200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:17.565 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:17.565 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.32 -s 1
2026-03-24T16:54:17.589 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:17.585+0000 7f5239e44200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:17.596 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:17.596 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.33 -s 1
2026-03-24T16:54:17.620 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:17.617+0000 7f501048f200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:17.626 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:17.626 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.34 -s 1
2026-03-24T16:54:17.649 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:17.645+0000 7f1002ef7200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:17.656 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:17.656 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.35 -s 1
2026-03-24T16:54:17.680 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:17.677+0000 7f6fd9e56200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:17.687 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:17.687 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.36 -s 1
2026-03-24T16:54:17.710 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:17.709+0000 7f8a38ae5200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:17.716 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:17.716 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.37 -s 1
2026-03-24T16:54:17.737 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:17.733+0000 7f401df6d200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:17.742 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:17.742 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.38 -s 1
2026-03-24T16:54:17.765 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:17.761+0000 7efec697d200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:17.772 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:17.772 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.39 -s 1
2026-03-24T16:54:17.798 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:17.797+0000 7f6fa6483200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:17.804 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:17.804 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.40 -s 1
2026-03-24T16:54:17.828 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:17.825+0000 7fac6e381200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:17.836 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:17.836 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.41 -s 1
2026-03-24T16:54:17.903 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:17.901+0000 7fac4dd3c200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:17.911 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:17.911 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.42 -s 1
2026-03-24T16:54:17.934 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:17.933+0000 7f842ed15200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:17.941 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:17.941 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.43 -s 1
2026-03-24T16:54:17.964 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:17.961+0000 7f8f26729200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:17.970 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:17.970 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.44 -s 1
2026-03-24T16:54:17.993 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:17.989+0000 7f83f60f5200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:18.001 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:18.001 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.45 -s 1
2026-03-24T16:54:18.024 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:18.021+0000 7fe6e0bb4200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:18.030 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:18.030 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.46 -s 1
2026-03-24T16:54:18.053 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:18.049+0000 7fd4cfbee200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:18.059 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:18.059 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.47 -s 1
2026-03-24T16:54:18.082 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:18.081+0000 7ff97c61c200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:18.089 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:18.089 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.48 -s 1
2026-03-24T16:54:18.115 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:18.113+0000 7fe70ea04200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:18.123 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:18.123 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.49 -s 1
2026-03-24T16:54:18.146 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:18.141+0000 7fb0a1cd0200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:18.151 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:18.151 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.50 -s 1
2026-03-24T16:54:18.173 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:18.169+0000 7f8176173200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:18.179 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:18.179 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.51 -s 1
2026-03-24T16:54:18.202 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:18.201+0000 7fc51a37f200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:18.210 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:18.210 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.52 -s 1
2026-03-24T16:54:18.235 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:18.233+0000 7fcdda316200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:18.245 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:18.245 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.53 -s 1
2026-03-24T16:54:18.268 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:18.265+0000 7fcab1688200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:18.276 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:18.276 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.54 -s 1
2026-03-24T16:54:18.300 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:18.297+0000 7f7fa440e200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:18.306 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:18.307 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.55 -s 1
2026-03-24T16:54:18.334 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:18.333+0000 7fe922505200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:18.341 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:18.341 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.56 -s 1
2026-03-24T16:54:18.365 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:18.361+0000 7fa5f1196200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:18.372 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:18.372 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.57 -s 1
2026-03-24T16:54:18.402 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:18.401+0000 7f6476a36200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:18.413 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:18.413 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.58 -s 1
2026-03-24T16:54:18.437 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:18.433+0000 7faa9c952200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:18.446 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:18.446 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.59 -s 1
2026-03-24T16:54:18.470 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:18.469+0000 7f3addc05200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:18.477 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:18.477 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.60 -s 1
2026-03-24T16:54:18.500 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:18.497+0000 7f3d1316a200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:18.508 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:18.508 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.61 -s 1
2026-03-24T16:54:18.531 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:18.529+0000 7f46f2b66200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:18.543 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:18.543 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.62 -s 1
2026-03-24T16:54:18.566 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:18.561+0000 7f9298c89200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:18.573 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:18.573 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.63 -s 1
2026-03-24T16:54:18.596 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:18.593+0000 7f9fbef1f200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:18.604 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:18.604 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.64 -s 1
2026-03-24T16:54:18.629 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:18.625+0000 7fb25204a200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:18.636 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:18.636 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.65 -s 1
2026-03-24T16:54:18.660 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:18.657+0000 7f840c116200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:18.666 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:18.666 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.66 -s 1
2026-03-24T16:54:18.690 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:18.689+0000 7f08aa999200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:18.696 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:18.696 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.67 -s 1
2026-03-24T16:54:18.718 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:18.717+0000 7fe059a66200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:18.724 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:18.724 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.68 -s 1
2026-03-24T16:54:18.745 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:18.741+0000 7f9d600f9200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:18.750 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:18.750 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.69 -s 1
2026-03-24T16:54:18.771 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:18.769+0000 7f3bd2d02200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:18.776 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:18.776 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.70 -s 1
2026-03-24T16:54:18.802 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:18.797+0000 7fb7f7606200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:18.810 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:18.810 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.71 -s 1
2026-03-24T16:54:18.834 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:18.829+0000 7f7600169200 -1 librbd: Forced V1 image creation.
2026-03-24T16:54:18.840 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:18.840 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.72 -s 1 2026-03-24T16:54:18.863 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:18.861+0000 7f5d21a65200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:18.871 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:18.871 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.73 -s 1 2026-03-24T16:54:18.895 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:18.893+0000 7f9b88571200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:18.901 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:18.901 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.74 -s 1 2026-03-24T16:54:19.068 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:19.065+0000 7f76f6c71200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:19.073 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:19.073 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.75 -s 1 2026-03-24T16:54:19.098 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:19.093+0000 7effd29d6200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:19.105 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:19.105 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.76 -s 1 2026-03-24T16:54:19.126 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:19.125+0000 7febe1758200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:19.131 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:19.131 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.77 -s 1 2026-03-24T16:54:19.152 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:19.149+0000 7f2adb712200 -1 librbd: Forced V1 image creation. 
2026-03-24T16:54:19.156 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:19.156 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.78 -s 1 2026-03-24T16:54:19.177 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:19.173+0000 7f8d83105200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:19.182 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:19.182 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.79 -s 1 2026-03-24T16:54:19.202 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:19.201+0000 7f4fa8dbf200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:19.207 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:19.207 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.80 -s 1 2026-03-24T16:54:19.227 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:19.225+0000 7fc8d1835200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:19.232 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:19.232 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.81 -s 1 2026-03-24T16:54:19.253 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:19.249+0000 7f8880a0c200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:19.258 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:19.258 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.82 -s 1 2026-03-24T16:54:19.280 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:19.277+0000 7fe2f43fa200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:19.288 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:19.288 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.83 -s 1 2026-03-24T16:54:19.309 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:19.305+0000 7f4442e73200 -1 librbd: Forced V1 image creation. 
2026-03-24T16:54:19.315 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:19.315 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.84 -s 1 2026-03-24T16:54:19.339 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:19.337+0000 7fad01dfe200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:19.346 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:19.346 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.85 -s 1 2026-03-24T16:54:19.369 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:19.365+0000 7fd741c75200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:19.375 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:19.375 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.86 -s 1 2026-03-24T16:54:19.397 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:19.393+0000 7fa02be79200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:19.404 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:19.404 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.87 -s 1 2026-03-24T16:54:19.427 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:19.425+0000 7f7f8948e200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:19.433 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:19.433 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.88 -s 1 2026-03-24T16:54:19.456 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:19.453+0000 7fb948c61200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:19.464 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:19.464 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.89 -s 1 2026-03-24T16:54:19.488 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:19.485+0000 7fc3ea7d2200 -1 librbd: Forced V1 image creation. 
2026-03-24T16:54:19.493 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:19.493 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.90 -s 1 2026-03-24T16:54:19.517 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:19.513+0000 7f3c97bfc200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:19.522 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:19.522 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.91 -s 1 2026-03-24T16:54:19.542 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:19.541+0000 7f4f387b8200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:19.547 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:19.547 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.92 -s 1 2026-03-24T16:54:19.571 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:19.569+0000 7f38895f2200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:19.579 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:19.579 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.93 -s 1 2026-03-24T16:54:19.601 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:19.597+0000 7fde531d4200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:19.609 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:19.609 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.94 -s 1 2026-03-24T16:54:19.632 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:19.629+0000 7fee724b9200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:19.640 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:19.640 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.95 -s 1 2026-03-24T16:54:19.662 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:19.661+0000 7f522ab6b200 -1 librbd: Forced V1 image creation. 
2026-03-24T16:54:19.669 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:19.669 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.96 -s 1 2026-03-24T16:54:19.691 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:19.689+0000 7fcec8ff5200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:19.700 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:19.700 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.97 -s 1 2026-03-24T16:54:19.725 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:19.721+0000 7f7af0d5c200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:19.731 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:19.731 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.98 -s 1 2026-03-24T16:54:19.754 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:19.749+0000 7f4276716200 -1 librbd: Forced V1 image creation. 2026-03-24T16:54:19.761 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:19.761 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.99 -s 1 2026-03-24T16:54:19.786 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:19.785+0000 7f8dfbb43200 -1 librbd: Forced V1 image creation. 
2026-03-24T16:54:19.793 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls
2026-03-24T16:54:19.793 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T16:54:19.793 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 100
2026-03-24T16:54:19.822 INFO:tasks.workunit.client.0.vm01.stdout:100
2026-03-24T16:54:19.822 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls -l
2026-03-24T16:54:19.822 INFO:tasks.workunit.client.0.vm01.stderr:+ grep image
2026-03-24T16:54:19.822 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T16:54:19.822 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 100
2026-03-24T16:54:19.869 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:19.865+0000 7f723b7fe640 0 -- 192.168.123.101:0/2798685029 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f7218030b70 msgr2=0x7f72180311a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T16:54:19.875 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:19.873+0000 7f7241a67640 0 -- 192.168.123.101:0/2798685029 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55bcf5137aa0 msgr2=0x55bcf51780a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T16:54:19.984 INFO:tasks.workunit.client.0.vm01.stdout:100
2026-03-24T16:54:19.984 INFO:tasks.workunit.client.0.vm01.stderr:++ seq -w 00 99
2026-03-24T16:54:19.985 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:19.985 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.00
2026-03-24T16:54:20.017 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:20.021 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:20.021 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.01
2026-03-24T16:54:20.052 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:20.055 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:20.055 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.02 2026-03-24T16:54:20.086 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:20.089 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:20.089 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.03 2026-03-24T16:54:20.118 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:20.122 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:20.122 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.04 2026-03-24T16:54:20.154 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:20.158 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:20.158 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.05 2026-03-24T16:54:20.188 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:20.192 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:20.192 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.06 2026-03-24T16:54:20.222 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:20.226 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:20.226 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.07 2026-03-24T16:54:20.257 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:20.261 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:20.261 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.08 2026-03-24T16:54:20.293 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 
2026-03-24T16:54:20.297 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:20.297 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.09 2026-03-24T16:54:20.328 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:20.331 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:20.331 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.10 2026-03-24T16:54:20.366 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:20.370 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:20.370 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.11 2026-03-24T16:54:20.402 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:20.406 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:20.406 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.12 2026-03-24T16:54:20.438 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:20.442 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:20.442 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.13 2026-03-24T16:54:20.475 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:20.479 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:20.479 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.14 2026-03-24T16:54:20.512 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:20.515 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:20.515 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.15 2026-03-24T16:54:20.547 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 
2026-03-24T16:54:20.551 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:20.551 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.16 2026-03-24T16:54:20.581 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:20.585 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:20.585 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.17 2026-03-24T16:54:20.616 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:20.619 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:20.619 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.18 2026-03-24T16:54:20.649 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:20.653 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:20.653 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.19 2026-03-24T16:54:20.688 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:20.692 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:20.692 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.20 2026-03-24T16:54:20.726 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:20.729 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:20.729 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.21 2026-03-24T16:54:20.759 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:20.762 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:20.763 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.22 2026-03-24T16:54:20.793 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 
2026-03-24T16:54:20.797 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:20.797 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.23 2026-03-24T16:54:20.834 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:20.837 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:20.837 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.24 2026-03-24T16:54:20.870 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:20.873 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:20.873 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.25 2026-03-24T16:54:20.907 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:20.911 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:20.911 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.26 2026-03-24T16:54:20.945 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:20.949 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:20.949 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.27 2026-03-24T16:54:20.983 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:20.987 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:20.987 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.28 2026-03-24T16:54:21.020 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:21.024 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:21.024 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.29 2026-03-24T16:54:21.055 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 
2026-03-24T16:54:21.059 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:21.059 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.30 2026-03-24T16:54:21.091 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:21.095 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:21.095 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.31 2026-03-24T16:54:21.127 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:21.130 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:21.130 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.32 2026-03-24T16:54:21.160 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:21.163 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:21.163 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.33 2026-03-24T16:54:21.192 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:21.195 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:21.195 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.34 2026-03-24T16:54:21.228 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:21.231 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:21.231 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.35 2026-03-24T16:54:21.263 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:21.267 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:21.267 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.36 2026-03-24T16:54:21.297 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 
2026-03-24T16:54:21.300 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:21.300 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.37 2026-03-24T16:54:21.329 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:21.331 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:21.331 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.38 2026-03-24T16:54:21.360 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:21.362 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:21.362 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.39 2026-03-24T16:54:21.407 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:21.410 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:21.410 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.40 2026-03-24T16:54:21.439 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:21.441 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:21.441 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.41 2026-03-24T16:54:21.469 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:21.471 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:21.471 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.42 2026-03-24T16:54:21.503 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:21.507 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:21.507 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.43 2026-03-24T16:54:21.535 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 
2026-03-24T16:54:21.537 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:21.537 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.44 2026-03-24T16:54:21.567 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:21.572 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:21.572 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.45 2026-03-24T16:54:21.605 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:21.608 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:21.608 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.46 2026-03-24T16:54:21.640 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:21.644 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:21.644 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.47 2026-03-24T16:54:21.677 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:21.681 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:21.681 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.48 2026-03-24T16:54:21.717 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:21.721 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:21.722 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.49 2026-03-24T16:54:21.763 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:21.767 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:21.767 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.50 2026-03-24T16:54:21.798 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 
2026-03-24T16:54:21.802 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:21.802 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.51 2026-03-24T16:54:21.836 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:21.840 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:21.840 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.52 2026-03-24T16:54:21.874 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:21.878 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:21.878 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.53 2026-03-24T16:54:21.909 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:21.913 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:21.913 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.54 2026-03-24T16:54:21.944 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:21.948 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:21.948 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.55 2026-03-24T16:54:21.979 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:21.983 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:21.983 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.56 2026-03-24T16:54:22.016 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:22.021 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:22.021 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.57 2026-03-24T16:54:22.053 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 
2026-03-24T16:54:22.057 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:22.057 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.58 2026-03-24T16:54:22.293 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:22.297 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:22.297 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.59 2026-03-24T16:54:22.328 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:22.332 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:22.332 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.60 2026-03-24T16:54:22.362 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:22.366 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:22.366 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.61 2026-03-24T16:54:22.502 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:22.506 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:22.506 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.62 2026-03-24T16:54:22.538 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:22.542 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:22.542 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.63 2026-03-24T16:54:22.603 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:22.607 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:22.607 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.64 2026-03-24T16:54:22.652 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 
2026-03-24T16:54:22.655 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:22.655 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.65 2026-03-24T16:54:22.690 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:22.693 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:22.693 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.66 2026-03-24T16:54:22.725 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:22.729 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:22.729 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.67 2026-03-24T16:54:22.762 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:22.765 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:22.765 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.68 2026-03-24T16:54:22.798 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:22.802 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:22.802 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.69 2026-03-24T16:54:22.838 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:22.841 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:22.841 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.70 2026-03-24T16:54:22.872 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:22.875 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:22.875 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.71 2026-03-24T16:54:22.907 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 
2026-03-24T16:54:22.911 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:22.912 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.72 2026-03-24T16:54:23.153 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:23.156 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:23.156 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.73 2026-03-24T16:54:23.245 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:23.249 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:23.249 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.74 2026-03-24T16:54:23.279 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:23.283 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:23.283 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.75 2026-03-24T16:54:23.314 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:23.317 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:23.317 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.76 2026-03-24T16:54:23.347 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:23.350 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:23.350 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.77 2026-03-24T16:54:23.381 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:23.384 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:23.384 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.78 2026-03-24T16:54:23.415 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 
2026-03-24T16:54:23.419 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:23.419 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.79 2026-03-24T16:54:23.467 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:23.474 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:23.474 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.80 2026-03-24T16:54:23.581 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:23.586 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:23.586 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.81 2026-03-24T16:54:23.636 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:23.646 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:23.646 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.82 2026-03-24T16:54:23.829 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:23.834 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:23.834 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.83 2026-03-24T16:54:23.865 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:23.868 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:23.868 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.84 2026-03-24T16:54:23.901 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:23.904 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:23.904 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.85 2026-03-24T16:54:23.940 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 
2026-03-24T16:54:23.944 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:23.944 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.86 2026-03-24T16:54:23.972 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:23.975 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:23.976 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.87 2026-03-24T16:54:24.008 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:24.011 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:24.011 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.88 2026-03-24T16:54:24.043 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:24.047 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:24.047 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.89 2026-03-24T16:54:24.079 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:24.083 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:24.083 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.90 2026-03-24T16:54:24.116 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:24.120 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:24.120 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.91 2026-03-24T16:54:24.153 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:24.157 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:24.157 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.92 2026-03-24T16:54:24.298 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 
2026-03-24T16:54:24.301 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:24.301 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.93 2026-03-24T16:54:24.336 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:24.339 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:24.339 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.94 2026-03-24T16:54:24.371 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:24.374 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:24.374 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.95 2026-03-24T16:54:24.408 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:24.410 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:24.411 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.96 2026-03-24T16:54:24.444 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:24.448 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:24.448 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.97 2026-03-24T16:54:24.481 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:24.486 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:24.486 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.98 2026-03-24T16:54:24.518 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:24.522 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:24.522 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.99 2026-03-24T16:54:24.554 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 
2026-03-24T16:54:24.557 INFO:tasks.workunit.client.0.vm01.stderr:++ seq -w 00 99 2026-03-24T16:54:24.558 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:24.558 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.00 --image-format 2 -s 1 2026-03-24T16:54:24.596 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:24.596 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.01 --image-format 2 -s 1 2026-03-24T16:54:24.633 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:24.633 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.02 --image-format 2 -s 1 2026-03-24T16:54:24.673 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:24.673 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.03 --image-format 2 -s 1 2026-03-24T16:54:24.709 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:24.709 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.04 --image-format 2 -s 1 2026-03-24T16:54:24.744 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:24.744 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.05 --image-format 2 -s 1 2026-03-24T16:54:24.779 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:24.779 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.06 --image-format 2 -s 1 2026-03-24T16:54:24.816 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:24.816 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.07 --image-format 2 -s 1 2026-03-24T16:54:24.852 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:24.852 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.08 --image-format 2 -s 1 2026-03-24T16:54:24.889 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 
2026-03-24T16:54:24.889 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.09 --image-format 2 -s 1 2026-03-24T16:54:24.925 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:24.925 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.10 --image-format 2 -s 1 2026-03-24T16:54:24.960 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:24.960 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.11 --image-format 2 -s 1 2026-03-24T16:54:24.999 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:24.999 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.12 --image-format 2 -s 1 2026-03-24T16:54:25.034 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:25.034 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.13 --image-format 2 -s 1 2026-03-24T16:54:25.070 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:25.071 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.14 --image-format 2 -s 1 2026-03-24T16:54:25.105 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:25.106 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.15 --image-format 2 -s 1 2026-03-24T16:54:25.141 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:25.141 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.16 --image-format 2 -s 1 2026-03-24T16:54:25.176 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:25.176 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.17 --image-format 2 -s 1 2026-03-24T16:54:25.213 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:25.213 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.18 --image-format 2 -s 1 2026-03-24T16:54:25.250 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq 
-w 00 99) 2026-03-24T16:54:25.250 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.19 --image-format 2 -s 1 2026-03-24T16:54:25.286 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:25.286 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.20 --image-format 2 -s 1 2026-03-24T16:54:25.327 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:25.327 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.21 --image-format 2 -s 1 2026-03-24T16:54:25.363 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:25.363 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.22 --image-format 2 -s 1 2026-03-24T16:54:25.398 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:25.399 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.23 --image-format 2 -s 1 2026-03-24T16:54:25.434 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:25.434 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.24 --image-format 2 -s 1 2026-03-24T16:54:25.470 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:25.470 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.25 --image-format 2 -s 1 2026-03-24T16:54:25.507 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:25.507 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.26 --image-format 2 -s 1 2026-03-24T16:54:25.544 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:25.545 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.27 --image-format 2 -s 1 2026-03-24T16:54:25.581 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:25.581 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.28 --image-format 2 -s 1 2026-03-24T16:54:25.618 INFO:tasks.workunit.client.0.vm01.stderr:+ for 
i in $(seq -w 00 99) 2026-03-24T16:54:25.618 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.29 --image-format 2 -s 1 2026-03-24T16:54:25.653 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:25.653 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.30 --image-format 2 -s 1 2026-03-24T16:54:25.690 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:25.690 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.31 --image-format 2 -s 1 2026-03-24T16:54:25.726 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:25.726 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.32 --image-format 2 -s 1 2026-03-24T16:54:25.762 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:25.762 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.33 --image-format 2 -s 1 2026-03-24T16:54:25.795 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:25.795 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.34 --image-format 2 -s 1 2026-03-24T16:54:25.827 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:25.828 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.35 --image-format 2 -s 1 2026-03-24T16:54:25.869 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:25.869 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.36 --image-format 2 -s 1 2026-03-24T16:54:25.906 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:25.906 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.37 --image-format 2 -s 1 2026-03-24T16:54:25.943 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:25.943 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.38 --image-format 2 -s 1 2026-03-24T16:54:25.978 
INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:25.978 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.39 --image-format 2 -s 1 2026-03-24T16:54:26.015 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:26.015 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.40 --image-format 2 -s 1 2026-03-24T16:54:26.051 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:26.051 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.41 --image-format 2 -s 1 2026-03-24T16:54:26.085 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:26.086 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.42 --image-format 2 -s 1 2026-03-24T16:54:26.120 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:26.120 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.43 --image-format 2 -s 1 2026-03-24T16:54:26.155 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:26.155 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.44 --image-format 2 -s 1 2026-03-24T16:54:26.191 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:26.191 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.45 --image-format 2 -s 1 2026-03-24T16:54:26.226 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:26.226 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.46 --image-format 2 -s 1 2026-03-24T16:54:26.265 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:26.265 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.47 --image-format 2 -s 1 2026-03-24T16:54:26.304 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:26.304 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.48 --image-format 2 -s 1 
2026-03-24T16:54:26.338 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:26.338 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.49 --image-format 2 -s 1 2026-03-24T16:54:26.373 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:26.373 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.50 --image-format 2 -s 1 2026-03-24T16:54:26.407 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:26.407 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.51 --image-format 2 -s 1 2026-03-24T16:54:26.445 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:26.445 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.52 --image-format 2 -s 1 2026-03-24T16:54:26.479 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:26.480 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.53 --image-format 2 -s 1 2026-03-24T16:54:26.511 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:26.511 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.54 --image-format 2 -s 1 2026-03-24T16:54:26.542 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:26.543 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.55 --image-format 2 -s 1 2026-03-24T16:54:26.579 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:26.579 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.56 --image-format 2 -s 1 2026-03-24T16:54:26.615 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:26.615 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.57 --image-format 2 -s 1 2026-03-24T16:54:26.652 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:26.652 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.58 
--image-format 2 -s 1 2026-03-24T16:54:26.690 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:26.690 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.59 --image-format 2 -s 1 2026-03-24T16:54:26.724 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:26.724 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.60 --image-format 2 -s 1 2026-03-24T16:54:26.755 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:26.755 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.61 --image-format 2 -s 1 2026-03-24T16:54:26.810 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:26.810 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.62 --image-format 2 -s 1 2026-03-24T16:54:26.846 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:26.846 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.63 --image-format 2 -s 1 2026-03-24T16:54:26.880 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:26.881 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.64 --image-format 2 -s 1 2026-03-24T16:54:26.930 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:26.930 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.65 --image-format 2 -s 1 2026-03-24T16:54:26.967 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:26.967 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.66 --image-format 2 -s 1 2026-03-24T16:54:27.000 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:27.000 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.67 --image-format 2 -s 1 2026-03-24T16:54:27.038 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:27.038 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd 
create image.68 --image-format 2 -s 1 2026-03-24T16:54:27.074 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:27.074 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.69 --image-format 2 -s 1 2026-03-24T16:54:27.111 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:27.111 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.70 --image-format 2 -s 1 2026-03-24T16:54:27.153 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:27.153 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.71 --image-format 2 -s 1 2026-03-24T16:54:27.188 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:27.188 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.72 --image-format 2 -s 1 2026-03-24T16:54:27.227 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:27.227 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.73 --image-format 2 -s 1 2026-03-24T16:54:27.264 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:27.265 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.74 --image-format 2 -s 1 2026-03-24T16:54:27.301 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:27.301 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.75 --image-format 2 -s 1 2026-03-24T16:54:27.533 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:27.534 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.76 --image-format 2 -s 1 2026-03-24T16:54:27.566 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:27.566 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.77 --image-format 2 -s 1 2026-03-24T16:54:27.601 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:27.601 
INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.78 --image-format 2 -s 1 2026-03-24T16:54:27.638 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:27.638 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.79 --image-format 2 -s 1 2026-03-24T16:54:27.674 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:27.674 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.80 --image-format 2 -s 1 2026-03-24T16:54:27.712 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:27.712 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.81 --image-format 2 -s 1 2026-03-24T16:54:27.749 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:27.750 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.82 --image-format 2 -s 1 2026-03-24T16:54:27.783 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:27.783 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.83 --image-format 2 -s 1 2026-03-24T16:54:27.821 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:27.821 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.84 --image-format 2 -s 1 2026-03-24T16:54:27.859 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:27.859 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.85 --image-format 2 -s 1 2026-03-24T16:54:27.896 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:27.896 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.86 --image-format 2 -s 1 2026-03-24T16:54:27.935 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:27.935 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.87 --image-format 2 -s 1 2026-03-24T16:54:27.974 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 
2026-03-24T16:54:27.974 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.88 --image-format 2 -s 1 2026-03-24T16:54:28.012 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:28.012 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.89 --image-format 2 -s 1 2026-03-24T16:54:28.050 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:28.050 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.90 --image-format 2 -s 1 2026-03-24T16:54:28.094 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:28.094 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.91 --image-format 2 -s 1 2026-03-24T16:54:28.131 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:28.131 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.92 --image-format 2 -s 1 2026-03-24T16:54:28.170 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:28.170 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.93 --image-format 2 -s 1 2026-03-24T16:54:28.207 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:28.207 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.94 --image-format 2 -s 1 2026-03-24T16:54:28.243 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:28.243 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.95 --image-format 2 -s 1 2026-03-24T16:54:28.279 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:28.279 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.96 --image-format 2 -s 1 2026-03-24T16:54:28.846 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:28.846 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.97 --image-format 2 -s 1 2026-03-24T16:54:28.883 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq 
-w 00 99) 2026-03-24T16:54:28.883 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.98 --image-format 2 -s 1 2026-03-24T16:54:28.920 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:28.920 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create image.99 --image-format 2 -s 1 2026-03-24T16:54:28.955 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls 2026-03-24T16:54:28.955 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T16:54:28.955 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 100 2026-03-24T16:54:28.980 INFO:tasks.workunit.client.0.vm01.stdout:100 2026-03-24T16:54:28.980 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls -l 2026-03-24T16:54:28.980 INFO:tasks.workunit.client.0.vm01.stderr:+ grep image 2026-03-24T16:54:28.980 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T16:54:28.980 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 100 2026-03-24T16:54:29.015 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:29.013+0000 7f3ca0f21640 0 -- 192.168.123.101:0/2562296202 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55baaeb2ce70 msgr2=0x55baaeb1dde0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T16:54:29.018 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:29.013+0000 7f3c9fc98640 0 -- 192.168.123.101:0/2562296202 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f3c78008d30 msgr2=0x7f3c780291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T16:54:29.247 INFO:tasks.workunit.client.0.vm01.stdout:100 2026-03-24T16:54:29.247 INFO:tasks.workunit.client.0.vm01.stderr:++ seq -w 00 99 2026-03-24T16:54:29.249 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:29.249 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.00 2026-03-24T16:54:29.330 
INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:29.325+0000 7fef7d98a640 0 -- 192.168.123.101:0/845115645 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x5617c17975b0 msgr2=0x5617c17874b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T16:54:29.333 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:29.337 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:29.337 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.01 2026-03-24T16:54:29.612 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:29.609+0000 7fc28ebc3640 0 -- 192.168.123.101:0/785832522 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x5649121f9320 msgr2=0x56491222e390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T16:54:29.615 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:29.619 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:29.620 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.02 2026-03-24T16:54:29.685 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:29.681+0000 7f733b7fe640 0 -- 192.168.123.101:0/1845477353 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f732405bd40 msgr2=0x7f732407c120 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T16:54:29.692 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 
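The `injecting socket failure` lines above are expected noise, not errors: this job's `msgr-failures/few` fragment enables messenger fault injection so the client exercises its reconnect path. Per the job config at the top of the log, the override amounts to roughly this `ceph.conf` fragment (paraphrased from the YAML; the exact file layout on the node may differ):

```ini
[global]
; Inject a socket failure roughly once per 5000 messenger operations,
; forcing RADOS clients to reconnect mid-operation.
ms inject socket failures = 5000
; Tolerate the resulting command retries on the mon side.
mon client directed command retry = 5
```

Each injected failure shows up in stderr as a `read_until`/`_try_send injecting socket failure` line from the client messenger, after which the `rbd rm` still completes normally.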
2026-03-24T16:54:29.697 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:29.697 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.03 2026-03-24T16:54:29.768 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:29.765+0000 7f87dd121640 0 -- 192.168.123.101:0/1006858674 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x556bb439b5b0 msgr2=0x556bb438b4b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T16:54:29.771 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:29.776 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:29.776 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.04 2026-03-24T16:54:29.852 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:29.849+0000 7fc333fff640 0 -- 192.168.123.101:0/1666715554 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7fc310004f90 msgr2=0x7fc31000f490 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T16:54:29.854 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:29.859 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:29.859 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.05 2026-03-24T16:54:29.934 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 
2026-03-24T16:54:29.938 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:29.938 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.06 2026-03-24T16:54:30.025 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:30.021+0000 7fb3fbc05640 0 -- 192.168.123.101:0/3414655978 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x562c34cd8320 msgr2=0x562c34cb6800 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T16:54:30.026 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:30.030 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:30.031 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.07 2026-03-24T16:54:30.112 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:30.109+0000 7f24b8774640 0 -- 192.168.123.101:0/2669341905 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7f249805bcf0 msgr2=0x7f249807c0d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T16:54:30.172 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T16:54:30.176 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99) 2026-03-24T16:54:30.176 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.08 2026-03-24T16:54:30.249 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:30.245+0000 7f60d77fe640 0 -- 192.168.123.101:0/1234487175 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f60b800aa70 msgr2=0x7f60b80031f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T16:54:30.252 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 
2026-03-24T16:54:30.256 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:30.256 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.09
2026-03-24T16:54:30.324 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:30.321+0000 7f43c94e6640 0 -- 192.168.123.101:0/4127170190 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f43ac008d30 msgr2=0x7f43ac0291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T16:54:30.327 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:30.331 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:30.331 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.10
2026-03-24T16:54:30.407 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:30.411 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:30.411 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.11
2026-03-24T16:54:30.489 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:30.485+0000 7f27edeb7640 0 -- 192.168.123.101:0/2456792861 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f27d0012d20 msgr2=0x7f27d0013190 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T16:54:30.493 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:30.497 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:30.497 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.12
2026-03-24T16:54:30.568 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:30.565+0000 7f6fc59df640 0 -- 192.168.123.101:0/184745657 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55b856dff320 msgr2=0x55b856ddd790 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T16:54:30.572 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:30.577 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:30.577 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.13
2026-03-24T16:54:30.648 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:30.645+0000 7f00b0982640 0 -- 192.168.123.101:0/867222512 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f009005be10 msgr2=0x7f009007c1f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T16:54:30.656 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:30.660 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:30.660 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.14
2026-03-24T16:54:30.748 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:30.753 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:30.753 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.15
2026-03-24T16:54:30.824 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:30.821+0000 7ff5e9ed5640 0 -- 192.168.123.101:0/454652663 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7ff5cc008d30 msgr2=0x7ff5cc0291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T16:54:30.828 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:30.833 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:30.833 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.16
2026-03-24T16:54:30.904 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:30.909 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:30.909 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.17
2026-03-24T16:54:30.981 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:30.977+0000 7fe75b537640 0 -- 192.168.123.101:0/2419073727 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55e46ba11320 msgr2=0x55e46ba46390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T16:54:30.986 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:30.990 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:30.990 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.18
2026-03-24T16:54:31.062 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:31.061+0000 7fd80f346640 0 -- 192.168.123.101:0/3377056266 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55901345a360 msgr2=0x559013489bd0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T16:54:31.066 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:31.070 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:31.070 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.19
2026-03-24T16:54:31.141 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:31.137+0000 7ff569dcb640 0 -- 192.168.123.101:0/3076644806 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x559ad69cf5b0 msgr2=0x559ad69bf4b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T16:54:31.141 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:31.145 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:31.145 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.20
2026-03-24T16:54:31.232 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:31.229+0000 7f59c5165640 0 -- 192.168.123.101:0/967169089 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x5612403fb320 msgr2=0x561240430390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T16:54:31.232 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:31.237 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:31.237 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.21
2026-03-24T16:54:31.319 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:31.317+0000 7f3b3f078640 0 -- 192.168.123.101:0/1161195249 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x5625505a8320 msgr2=0x562550586800 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T16:54:31.324 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:31.328 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:31.328 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.22
2026-03-24T16:54:31.396 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:31.393+0000 7fd80971c640 0 -- 192.168.123.101:0/548988888 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7fd7e0002ee0 msgr2=0x7fd7e00271c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T16:54:31.404 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:31.409 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:31.409 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.23
2026-03-24T16:54:31.488 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:31.485+0000 7f8068b22640 0 -- 192.168.123.101:0/2049957805 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7f804805bda0 msgr2=0x7f804807c180 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T16:54:31.492 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:31.496 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:31.496 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.24
2026-03-24T16:54:31.567 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:31.565+0000 7f6b90540640 0 -- 192.168.123.101:0/549227232 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55c96e4a7320 msgr2=0x55c96e485790 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T16:54:31.569 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:31.573 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:31.573 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.25
2026-03-24T16:54:31.641 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:31.637+0000 7f4ed3e5b640 0 -- 192.168.123.101:0/4214242572 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x558521e7df70 msgr2=0x558521e9e3f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T16:54:31.644 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:31.648 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:31.648 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.26
2026-03-24T16:54:31.720 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:31.717+0000 7ff4a8b23640 0 -- 192.168.123.101:0/1703287495 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7ff480008d30 msgr2=0x7ff4800291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T16:54:31.724 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:31.729 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:31.729 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.27
2026-03-24T16:54:31.802 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:31.801+0000 7fd3e3ef6640 0 -- 192.168.123.101:0/3646541 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55685e2d5320 msgr2=0x55685e30a390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T16:54:31.803 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:31.808 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:31.808 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.28
2026-03-24T16:54:31.886 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:31.885+0000 7f8028cc0640 0 -- 192.168.123.101:0/3558369520 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x5602ca391320 msgr2=0x5602ca36f800 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T16:54:31.890 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:31.894 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:31.894 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.29
2026-03-24T16:54:31.964 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:31.961+0000 7fcfd5b3c640 0 -- 192.168.123.101:0/4072889529 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x5556b1f2c320 msgr2=0x5556b1f0a800 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T16:54:31.967 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:31.972 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:31.972 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.30
2026-03-24T16:54:32.045 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:32.041+0000 7fd1a37fe640 0 -- 192.168.123.101:0/500846840 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7fd180012e20 msgr2=0x7fd180013290 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T16:54:32.047 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:32.051 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:32.051 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.31
2026-03-24T16:54:32.130 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:32.134 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:32.134 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.32
2026-03-24T16:54:32.212 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:32.209+0000 7f0947bc0640 0 -- 192.168.123.101:0/1825032612 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55de0685c320 msgr2=0x55de06891390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T16:54:32.212 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:32.216 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:32.216 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.33
2026-03-24T16:54:32.288 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:32.285+0000 7efda0ed4640 0 -- 192.168.123.101:0/1921577991 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7efd8405bc80 msgr2=0x7efd8407c060 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T16:54:32.298 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:32.302 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:32.302 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.34
2026-03-24T16:54:32.367 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:32.365+0000 7f461e7af640 0 -- 192.168.123.101:0/689064541 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f45f8008d30 msgr2=0x7f45f80291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T16:54:32.369 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:32.373 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:32.373 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.35
2026-03-24T16:54:32.450 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:32.449+0000 7f261cac1640 0 -- 192.168.123.101:0/3507663261 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x5564fbf9b5b0 msgr2=0x5564fbf8b4b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T16:54:32.454 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:32.458 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:32.458 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.36
2026-03-24T16:54:32.529 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:32.525+0000 7fded5b49640 0 -- 192.168.123.101:0/775250767 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55e5c7224320 msgr2=0x55e5c7202800 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T16:54:32.529 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:32.533 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:32.533 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.37
2026-03-24T16:54:32.605 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:32.601+0000 7fa5b1638640 0 -- 192.168.123.101:0/3291851230 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x55c784dd7e30 msgr2=0x55c784df82b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T16:54:32.609 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T16:54:32.613 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T16:54:32.613 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.38
2026-03-24T16:54:32.682 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:54:32.677+0000 7fb80b7fe640 0 -- 192.168.123.101:0/4072145479 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7fb7e8008d30 msgr2=0x7fb7e80291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T16:57:32.653 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T16:57:32.649+0000 7fb80bfff640 0 -- 192.168.123.101:0/4072145479 >> [v2:192.168.123.101:3300/0,v1:192.168.123.101:6789/0] conn(0x557f519a7c50 msgr2=0x557f5199c710 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:32.696 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:32.700 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:32.700 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.39
2026-03-24T17:09:32.760 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:32.763 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:32.763 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.40
2026-03-24T17:09:32.823 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:32.817+0000 7f6c87fff640 0 -- 192.168.123.101:0/1901672892 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x56505b753f30 msgr2=0x56505b7743b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T17:09:32.826 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:32.830 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:32.830 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.41
2026-03-24T17:09:32.898 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:32.889+0000 7fd222c3d640 0 -- 192.168.123.101:0/334837505 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55d1775d9dc0 msgr2=0x55d1776bd9d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:32.898 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:32.902 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:32.902 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.42
2026-03-24T17:09:32.973 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:32.965+0000 7fc853141640 0 -- 192.168.123.101:0/4125326673 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55793effc320 msgr2=0x55793efda790 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T17:09:32.976 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:32.980 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:32.980 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.43
2026-03-24T17:09:33.057 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:33.049+0000 7f9ef60e2640 0 -- 192.168.123.101:0/1966786005 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x5572cc706360 msgr2=0x5572cc737f10 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:33.059 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:33.063 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:33.063 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.44
2026-03-24T17:09:33.133 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:33.125+0000 7efeacfc7640 0 -- 192.168.123.101:0/2875966199 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55c705654320 msgr2=0x55c705689390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T17:09:33.139 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:33.144 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:33.144 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.45
2026-03-24T17:09:33.213 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:33.205+0000 7f81c9635640 0 -- 192.168.123.101:0/998173094 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x563cb1e43320 msgr2=0x563cb1e78390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:33.218 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:33.222 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:33.222 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.46
2026-03-24T17:09:33.299 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:33.293+0000 7fc6a7d79640 0 -- 192.168.123.101:0/2595292217 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x556446289320 msgr2=0x556446267800 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:33.302 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:33.307 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:33.307 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.47
2026-03-24T17:09:33.372 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:33.365+0000 7f2d9f05b640 0 -- 192.168.123.101:0/823234341 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f2d78008d30 msgr2=0x7f2d780291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T17:09:33.377 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:33.381 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:33.381 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.48
2026-03-24T17:09:33.444 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:33.437+0000 7fe7f36de640 0 -- 192.168.123.101:0/2149351367 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7fe7cc008d30 msgr2=0x7fe7cc0291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:33.446 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:33.450 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:33.450 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.49
2026-03-24T17:09:33.517 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:33.509+0000 7f97a566a640 0 -- 192.168.123.101:0/789557257 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x559fe6324320 msgr2=0x559fe6302800 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:33.520 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:33.524 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:33.524 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.50
2026-03-24T17:09:33.593 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:33.585+0000 7f38b6a6a640 0 -- 192.168.123.101:0/2435852484 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f388c006fe0 msgr2=0x7f388c0273c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:33.595 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:33.599 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:33.599 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.51
2026-03-24T17:09:33.664 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:33.657+0000 7f5ead75d640 0 -- 192.168.123.101:0/3143837386 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x5613e05df320 msgr2=0x5613e0614390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:33.667 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:33.671 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:33.671 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.52
2026-03-24T17:09:33.741 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:33.733+0000 7fd27cbcf640 0 -- 192.168.123.101:0/4186311382 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55c47d75c320 msgr2=0x55c47d791390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:33.742 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:33.746 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:33.746 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.53
2026-03-24T17:09:33.814 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:33.805+0000 7f6bfd1dc640 0 -- 192.168.123.101:0/1545092483 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x560f21fe5320 msgr2=0x560f21fc3790 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:33.816 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:33.820 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:33.820 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.54
2026-03-24T17:09:33.899 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:33.893+0000 7f2e01856640 0 -- 192.168.123.101:0/2043600017 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f2de00057f0 msgr2=0x7f2de0005c60 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:33.902 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:33.907 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:33.907 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.55
2026-03-24T17:09:33.978 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:33.973+0000 7f688f7e5640 0 -- 192.168.123.101:0/162555965 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55dac1e2d380 msgr2=0x55dac1e1f6e0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:33.979 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:33.983 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:33.983 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.56
2026-03-24T17:09:34.053 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:34.045+0000 7f1a2d7b2640 0 -- 192.168.123.101:0/3114394109 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f1a04006500 msgr2=0x7f1a04006970 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:34.056 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:34.060 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:34.060 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.57
2026-03-24T17:09:34.126 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:34.121+0000 7f9558acb640 0 -- 192.168.123.101:0/3528190761 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f9530008d30 msgr2=0x7f95300291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:34.130 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:34.134 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:34.134 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.58
2026-03-24T17:09:34.204 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:34.197+0000 7f363b420640 0 -- 192.168.123.101:0/3888976098 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f361c012dd0 msgr2=0x7f361c013240 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:34.206 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:34.210 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:34.210 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.59
2026-03-24T17:09:34.276 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:34.269+0000 7fe76f3ee640 0 -- 192.168.123.101:0/1361501130 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x559d7ae80320 msgr2=0x559d7ae5e790 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:34.276 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:34.280 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:34.280 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.60
2026-03-24T17:09:34.350 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:34.345+0000 7f9fda33f640 0 -- 192.168.123.101:0/108879426 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55e5c435f320 msgr2=0x55e5c4394390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:34.354 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:34.358 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:34.358 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.61
2026-03-24T17:09:34.419 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:34.413+0000 7f95fec92640 0 -- 192.168.123.101:0/4238146727 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x5575fe6b8dc0 msgr2=0x5575fe79c9d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:34.422 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:34.426 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:34.426 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.62
2026-03-24T17:09:34.498 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:34.489+0000 7f319e8a6640 0 -- 192.168.123.101:0/607441581 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55eb6292f320 msgr2=0x55eb6290d800 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:34.498 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:34.502 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:34.502 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.63
2026-03-24T17:09:34.574 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:34.565+0000 7fe98e195640 0 -- 192.168.123.101:0/1182346826 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55ee4f314320 msgr2=0x55ee4f349390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:34.577 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:34.581 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:34.581 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.64
2026-03-24T17:09:34.655 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:34.649+0000 7f1c4e453640 0 -- 192.168.123.101:0/3654494790 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f1c2c006d40 msgr2=0x7f1c2c0271c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:34.658 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:34.662 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:34.662 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.65
2026-03-24T17:09:34.733 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:34.737 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:34.737 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.66
2026-03-24T17:09:34.800 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:34.793+0000 7f0ffa16a640 0 -- 192.168.123.101:0/3903089561 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f0fd8001910 msgr2=0x7f0fd8027bf0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T17:09:34.804 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:34.807 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:34.807 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.67
2026-03-24T17:09:34.877 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:34.869+0000 7fdf08cf4640 0 -- 192.168.123.101:0/3022735172 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x7fdee805bcf0 msgr2=0x7fdee807c0d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:35.081 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:35.085 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:35.085 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.68
2026-03-24T17:09:35.149 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:35.141+0000 7f2740a27640 0 -- 192.168.123.101:0/1249852663 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f2718008d70 msgr2=0x7f27180291f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:35.151 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:35.155 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:35.155 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.69
2026-03-24T17:09:35.429 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:35.421+0000 7fc9cf28e640 0 -- 192.168.123.101:0/3209512737 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7fc9ac048dd0 msgr2=0x7fc9ac00b290 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:35.635 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:35.639 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:35.639 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.70
2026-03-24T17:09:35.709 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:35.701+0000 7f28c0ce3640 0 -- 192.168.123.101:0/2524169353 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f289c006c40 msgr2=0x7f289c027020 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:35.712 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:35.717 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:35.717 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.71
2026-03-24T17:09:35.789 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:35.781+0000 7fbf914f5640 0 -- 192.168.123.101:0/1546650099 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7fbf680042f0 msgr2=0x7fbf6800ab30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T17:09:35.995 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:35.999 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:35.999 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.72
2026-03-24T17:09:36.069 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:36.061+0000 7feb2f019640 0 -- 192.168.123.101:0/2786889249 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x559ff957d5b0 msgr2=0x559ff9571800 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:36.072 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:36.076 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:36.076 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.73
2026-03-24T17:09:36.157 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:36.149+0000 7f89f77fe640 0 -- 192.168.123.101:0/3073342851 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f89d4008d30 msgr2=0x7f89d40291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:36.163 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:36.168 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:36.168 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.74
2026-03-24T17:09:36.241 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:36.233+0000 7f1a1bbd6640 0 -- 192.168.123.101:0/2294530836 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x563bde86ddc0 msgr2=0x563bde9719e0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:36.241 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:36.245 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:36.245 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.75
2026-03-24T17:09:36.338 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:36.329+0000 7ff93140c640 0 -- 192.168.123.101:0/2848628748 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7ff914008d30 msgr2=0x7ff9140291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:36.342 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:36.347 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:36.347 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.76
2026-03-24T17:09:36.413 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:36.405+0000 7fce237fe640 0 -- 192.168.123.101:0/613917186 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7fce04040e20 msgr2=0x7fce04000da0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:36.416 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:36.420 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:36.420 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.77
2026-03-24T17:09:36.486 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:36.489 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:36.489 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.78
2026-03-24T17:09:36.557 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:36.549+0000 7f5f9eef2640 0 -- 192.168.123.101:0/1751207557 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7f5f7c05bd40 msgr2=0x7f5f7c07c120 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T17:09:36.561 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:36.565 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:36.565 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.79
2026-03-24T17:09:36.645 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:36.649 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:36.649 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.80
2026-03-24T17:09:36.815 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:36.809+0000 7f878a242640 0 -- 192.168.123.101:0/2892551149 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x55a351d31010 msgr2=0x55a351d51490 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:36.817 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:36.821 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:36.821 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.81
2026-03-24T17:09:36.895 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:36.889+0000 7f622359c640 0 -- 192.168.123.101:0/864385416 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f61fc008d30 msgr2=0x7f61fc0291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:36.899 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:36.903 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:36.903 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.82
2026-03-24T17:09:36.974 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:36.965+0000 7ffa3c399640 0 -- 192.168.123.101:0/106565805 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55ee53b43320 msgr2=0x55ee53b21800 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:36.976 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:36.979 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:36.979 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.83
2026-03-24T17:09:37.057 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:37.049+0000 7f67228a8640 0 -- 192.168.123.101:0/859765427 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x561534e2f320 msgr2=0x561534e64390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:37.057 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:37.061 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:37.062 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.84
2026-03-24T17:09:37.136 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:37.129+0000 7f5cf1d10640 0 -- 192.168.123.101:0/2523312607 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x563bb7071320 msgr2=0x563bb70a6390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:37.136 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:37.140 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:37.140 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.85
2026-03-24T17:09:37.203 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:37.197+0000 7ff7e39af640 0 -- 192.168.123.101:0/448333469 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x557ac201a320 msgr2=0x557ac1ff8790 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:37.206 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:37.209 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:37.209 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.86
2026-03-24T17:09:37.269 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:37.261+0000 7f47708a6640 0 -- 192.168.123.101:0/4008598106 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x556685a3bdc0 msgr2=0x556685b1f9d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:37.271 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:37.274 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:37.274 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.87
2026-03-24T17:09:37.344 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:37.337+0000 7fb1e0d90640 0 -- 192.168.123.101:0/3465624107 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7fb1b8012e10 msgr2=0x7fb1b8013280 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:37.347 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:37.351 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:37.351 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.88
2026-03-24T17:09:37.418 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:37.409+0000 7fa032dec640 0 -- 192.168.123.101:0/2183360092 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x55dde6845db0 msgr2=0x55dde6866230 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:37.421 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:37.425 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:37.425 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.89
2026-03-24T17:09:37.509 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:37.501+0000 7fd75f10a640 0 -- 192.168.123.101:0/1851239962 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55c2f01e7320 msgr2=0x55c2f01c5800 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T17:09:37.522 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:37.527 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:37.527 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.90
2026-03-24T17:09:37.596 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:37.589+0000 7fdffab01640 0 -- 192.168.123.101:0/2662234060 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55debf920150 msgr2=0x55debf8d9ca0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:37.600 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:37.604 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:37.604 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.91
2026-03-24T17:09:37.673 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:37.665+0000 7fdb0b220640 0 -- 192.168.123.101:0/2318638028 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x559eae472fc0 msgr2=0x559eae493440 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:37.676 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:37.681 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:37.681 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.92
2026-03-24T17:09:37.746 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:37.741+0000 7f022022c640 0 -- 192.168.123.101:0/1758843409 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7f020005bcf0 msgr2=0x7f020007c0f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:37.752 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:37.756 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:37.756 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.93
2026-03-24T17:09:37.831 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:37.825+0000 7f31832d9640 0 -- 192.168.123.101:0/1445213839 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x559c90487320 msgr2=0x559c90465820 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:37.833 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:37.837 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:37.837 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.94
2026-03-24T17:09:37.901 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:37.893+0000 7ff53efcc640 0 -- 192.168.123.101:0/3070555232 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7ff518003840 msgr2=0x7ff518002cf0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:37.905 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:37.909 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:37.910 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.95
2026-03-24T17:09:37.977 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:37.969+0000 7fe732442640 0 -- 192.168.123.101:0/771867636 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55a9f6115320 msgr2=0x55a9f60f3800 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:37.977 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:37.981 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:37.981 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.96
2026-03-24T17:09:38.051 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:38.045+0000 7fbdc8f1a640 0 -- 192.168.123.101:0/1318324512 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7fbda0012e20 msgr2=0x7fbda0013290 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T17:09:38.055 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:38.059 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:38.059 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.97
2026-03-24T17:09:38.130 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:38.125+0000 7f135a258640 0 -- 192.168.123.101:0/3353511651 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55ddbfdce320 msgr2=0x55ddbfe03390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:38.131 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:38.135 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:38.135 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.98
2026-03-24T17:09:38.205 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:38.197+0000 7fc1348fe640 0 -- 192.168.123.101:0/3628295645 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x563809318ab0 msgr2=0x563809338f30 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:38.210 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:38.213 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in $(seq -w 00 99)
2026-03-24T17:09:38.214 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm image.99
2026-03-24T17:09:38.280 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:38.283 INFO:tasks.workunit.client.0.vm01.stdout:testing remove...
2026-03-24T17:09:38.283 INFO:tasks.workunit.client.0.vm01.stderr:+ test_remove
2026-03-24T17:09:38.283 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'testing remove...'
2026-03-24T17:09:38.283 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images
2026-03-24T17:09:38.283 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:38.353 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:38.419 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:38.486 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:38.552 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:38.637 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:38.704 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:38.769 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:38.833 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:38.898 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:38.964 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:39.029 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:39.092 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:39.156 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:39.220 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:39.287 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:39.350 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:39.413 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:39.473 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd remove NOT_EXIST
2026-03-24T17:09:39.509 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 0% complete...failed.
2026-03-24T17:09:39.509 INFO:tasks.workunit.client.0.vm01.stderr:rbd: delete error: (2) No such file or directory
2026-03-24T17:09:39.513 INFO:tasks.workunit.client.0.vm01.stderr:+ true
2026-03-24T17:09:39.513 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 1 -s 1 test1
2026-03-24T17:09:39.530 INFO:tasks.workunit.client.0.vm01.stderr:rbd: image format 1 is deprecated
2026-03-24T17:09:39.537 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:39.529+0000 7f4955100200 -1 librbd: Forced V1 image creation.
2026-03-24T17:09:39.545 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test1
2026-03-24T17:09:39.580 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:39.583 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls
2026-03-24T17:09:39.583 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:09:39.584 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^0$'
2026-03-24T17:09:39.609 INFO:tasks.workunit.client.0.vm01.stdout:0
2026-03-24T17:09:39.609 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 -s 1 test2
2026-03-24T17:09:39.650 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test2
2026-03-24T17:09:39.718 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:39.709+0000 7f9c7e296640 0 -- 192.168.123.101:0/1867089514 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x55db340bfaa0 msgr2=0x55db340dff20 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:39.722 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:39.726 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls
2026-03-24T17:09:39.726 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:09:39.726 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^0$'
2026-03-24T17:09:39.752 INFO:tasks.workunit.client.0.vm01.stdout:0
2026-03-24T17:09:39.752 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 1 -s 1 test1
2026-03-24T17:09:39.768 INFO:tasks.workunit.client.0.vm01.stderr:rbd: image format 1 is deprecated
2026-03-24T17:09:39.775 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:39.769+0000 7f513be7e200 -1 librbd: Forced V1 image creation.
2026-03-24T17:09:39.784 INFO:tasks.workunit.client.0.vm01.stderr:+ rados rm -p rbd test1.rbd
2026-03-24T17:09:39.809 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test1
2026-03-24T17:09:39.836 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:39.840 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls
2026-03-24T17:09:39.840 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:09:39.840 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^0$'
2026-03-24T17:09:39.866 INFO:tasks.workunit.client.0.vm01.stdout:0
2026-03-24T17:09:39.867 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' 0 -eq 0 ']'
2026-03-24T17:09:39.867 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 -s 1 test2
2026-03-24T17:09:39.904 INFO:tasks.workunit.client.0.vm01.stderr:++ rados -p rbd ls
2026-03-24T17:09:39.904 INFO:tasks.workunit.client.0.vm01.stderr:++ grep '^rbd_header'
2026-03-24T17:09:39.932 INFO:tasks.workunit.client.0.vm01.stderr:+ HEADER=rbd_header.18a5f8bf3f04
2026-03-24T17:09:39.932 INFO:tasks.workunit.client.0.vm01.stderr:+ rados -p rbd rm rbd_header.18a5f8bf3f04
2026-03-24T17:09:39.958 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test2
2026-03-24T17:09:40.000 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:39.993+0000 7f360cff2640 -1 librbd::image::OpenRequest: failed to retrieve initial metadata: (2) No such file or directory
2026-03-24T17:09:40.004 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:39.997+0000 7f360cff2640 -1 librbd::image::OpenRequest: failed to retrieve initial metadata: (2) No such file or directory
2026-03-24T17:09:40.013 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:40.005+0000 7f360d7f3640 -1 librbd::image::OpenRequest: failed to retrieve initial metadata: (2) No such file or directory
2026-03-24T17:09:40.019 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:40.023 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls
2026-03-24T17:09:40.023 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:09:40.023 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^0$'
2026-03-24T17:09:40.048 INFO:tasks.workunit.client.0.vm01.stdout:0
2026-03-24T17:09:40.048 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 -s 1 test2
2026-03-24T17:09:40.083 INFO:tasks.workunit.client.0.vm01.stderr:+ rados -p rbd rm rbd_id.test2
2026-03-24T17:09:40.107 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test2
2026-03-24T17:09:40.173 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:40.165+0000 7fb73b2c2640 0 -- 192.168.123.101:0/2974804382 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7fb718003370 msgr2=0x7fb718002a70 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:40.176 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:40.180 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls
2026-03-24T17:09:40.180 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:09:40.180 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^0$'
2026-03-24T17:09:40.204 INFO:tasks.workunit.client.0.vm01.stdout:0
2026-03-24T17:09:40.204 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 -s 1 test2
2026-03-24T17:09:40.240 INFO:tasks.workunit.client.0.vm01.stderr:++ rados -p rbd ls
2026-03-24T17:09:40.240 INFO:tasks.workunit.client.0.vm01.stderr:++ grep '^rbd_header'
2026-03-24T17:09:40.267 INFO:tasks.workunit.client.0.vm01.stderr:+ HEADER=rbd_header.18c3f16cc607
2026-03-24T17:09:40.267 INFO:tasks.workunit.client.0.vm01.stderr:+ rados -p rbd rm rbd_header.18c3f16cc607
2026-03-24T17:09:40.292 INFO:tasks.workunit.client.0.vm01.stderr:+ rados -p rbd rm rbd_id.test2
2026-03-24T17:09:40.316 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test2
2026-03-24T17:09:40.343 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:40.337+0000 7f5aaffff640 -1 librbd::image::OpenRequest: failed to retrieve initial metadata: (2) No such file or directory
2026-03-24T17:09:40.344 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:40.337+0000 7f5abc866640 -1 librbd::image::OpenRequest: failed to retrieve initial metadata: (2) No such file or directory
2026-03-24T17:09:40.353 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:40.345+0000 7f5abc866640 -1 librbd::image::OpenRequest: failed to retrieve initial metadata: (2) No such file or directory
2026-03-24T17:09:40.360 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:40.365 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls
2026-03-24T17:09:40.365 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:09:40.365 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^0$'
2026-03-24T17:09:40.388 INFO:tasks.workunit.client.0.vm01.stdout:0
2026-03-24T17:09:40.388 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 -s 1 test2
2026-03-24T17:09:40.423 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create test2@snap
2026-03-24T17:09:40.983 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T17:09:40.991 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap protect test2@snap
2026-03-24T17:09:41.028 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone test2@snap clone --rbd-default-clone-format 1
2026-03-24T17:09:41.076 INFO:tasks.workunit.client.0.vm01.stderr:+ rados -p rbd rm rbd_children
2026-03-24T17:09:41.101 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm clone
2026-03-24T17:09:41.163 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:41.157+0000 7fd96b949640 0 -- 192.168.123.101:0/3167689377 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x7fd9580023a0 msgr2=0x555c25257e50 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:41.167 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:41.171 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls
2026-03-24T17:09:41.171 INFO:tasks.workunit.client.0.vm01.stderr:+ grep clone
2026-03-24T17:09:41.171 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^0$'
2026-03-24T17:09:41.171 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:09:41.195 INFO:tasks.workunit.client.0.vm01.stdout:0
2026-03-24T17:09:41.195 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap unprotect test2@snap
2026-03-24T17:09:41.232 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap rm test2@snap
2026-03-24T17:09:41.982 INFO:tasks.workunit.client.0.vm01.stderr: Removing snap: 100% complete...done.
2026-03-24T17:09:41.989 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test2
2026-03-24T17:09:42.054 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:42.045+0000 7f518e9b5640 0 -- 192.168.123.101:0/3652364947 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55b6b400d320 msgr2=0x55b6b3feb790 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:42.258 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:42.262 INFO:tasks.workunit.client.0.vm01.stdout:testing migration...
2026-03-24T17:09:42.263 INFO:tasks.workunit.client.0.vm01.stderr:+ test_migration
2026-03-24T17:09:42.263 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'testing migration...'
2026-03-24T17:09:42.263 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images
2026-03-24T17:09:42.263 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:42.328 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:42.393 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:42.461 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:42.525 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:42.591 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:42.658 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:42.724 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:42.788 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:42.855 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:42.920 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:42.990 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:43.056 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:43.128 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:43.196 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:43.263 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:43.333 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:43.404 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:09:43.476 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd pool create rbd2 8
2026-03-24T17:09:44.048 INFO:tasks.workunit.client.0.vm01.stderr:pool 'rbd2' already exists
2026-03-24T17:09:44.062 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd pool init rbd2
2026-03-24T17:09:47.012 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 1 -s 128M test1
2026-03-24T17:09:47.033 INFO:tasks.workunit.client.0.vm01.stderr:rbd: image format 1 is deprecated
2026-03-24T17:09:47.042 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:47.033+0000 7f47c4f62200 -1 librbd: Forced V1 image creation.
2026-03-24T17:09:47.050 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info test1
2026-03-24T17:09:47.050 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'format: 1'
2026-03-24T17:09:47.079 INFO:tasks.workunit.client.0.vm01.stdout: format: 1
2026-03-24T17:09:47.079 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration prepare test1 --image-format 2
2026-03-24T17:09:47.138 INFO:tasks.workunit.client.0.vm01.stderr:++ get_migration_state test1
2026-03-24T17:09:47.138 INFO:tasks.workunit.client.0.vm01.stderr:++ local image=test1
2026-03-24T17:09:47.138 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd --format xml status test1
2026-03-24T17:09:47.138 INFO:tasks.workunit.client.0.vm01.stderr:++ xmlstarlet sel -t -v //status/migration/state
2026-03-24T17:09:47.193 INFO:tasks.workunit.client.0.vm01.stderr:+ test prepared = prepared
2026-03-24T17:09:47.193 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info test1
2026-03-24T17:09:47.193 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'format: 2'
2026-03-24T17:09:47.223 INFO:tasks.workunit.client.0.vm01.stdout: format: 2
2026-03-24T17:09:47.223 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test1
2026-03-24T17:09:47.254 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:47.245+0000 7f84eae6a200 -1 librbd::image::PreRemoveRequest: 0x561519b20680 validate_image_removal: image in migration state - not removing
2026-03-24T17:09:47.255 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 0% complete...failed.
2026-03-24T17:09:47.255 INFO:tasks.workunit.client.0.vm01.stderr:rbd: error: image still has watchers
2026-03-24T17:09:47.255 INFO:tasks.workunit.client.0.vm01.stderr:This means the image is still open or the client using it crashed. Try again after closing/unmapping it or waiting 30s for the crashed client to timeout.
2026-03-24T17:09:47.260 INFO:tasks.workunit.client.0.vm01.stderr:+ true
2026-03-24T17:09:47.260 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration execute test1
2026-03-24T17:09:47.322 INFO:tasks.workunit.client.0.vm01.stderr: Image migration: 3% complete... Image migration: 6% complete... Image migration: 9% complete... Image migration: 12% complete... Image migration: 15% complete... Image migration: 18% complete... Image migration: 21% complete... Image migration: 25% complete... Image migration: 28% complete... Image migration: 31% complete... Image migration: 34% complete... Image migration: 37% complete... Image migration: 40% complete... Image migration: 43% complete... Image migration: 46% complete... Image migration: 50% complete... Image migration: 53% complete... Image migration: 56% complete... Image migration: 59% complete... Image migration: 62% complete... Image migration: 65% complete... Image migration: 68% complete... Image migration: 71% complete... Image migration: 75% complete... Image migration: 78% complete... Image migration: 81% complete... Image migration: 84% complete... Image migration: 87% complete... Image migration: 90% complete... Image migration: 93% complete... Image migration: 96% complete...2026-03-24T17:09:47.317+0000 7fb8a77fe640 0 -- 192.168.123.101:0/1755308526 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7fb88803ac90 msgr2=0x7fb88805b120 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:47.328 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:47.321+0000 7fb8ad494640 0 -- 192.168.123.101:0/1755308526 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x559f425c5b60 msgr2=0x559f42655e10 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:47.534 INFO:tasks.workunit.client.0.vm01.stderr: Image migration: 100% complete...done.
2026-03-24T17:09:47.538 INFO:tasks.workunit.client.0.vm01.stderr:++ get_migration_state test1
2026-03-24T17:09:47.538 INFO:tasks.workunit.client.0.vm01.stderr:++ local image=test1
2026-03-24T17:09:47.538 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd --format xml status test1
2026-03-24T17:09:47.538 INFO:tasks.workunit.client.0.vm01.stderr:++ xmlstarlet sel -t -v //status/migration/state
2026-03-24T17:09:47.582 INFO:tasks.workunit.client.0.vm01.stderr:+ test executed = executed
2026-03-24T17:09:47.582 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration commit test1
2026-03-24T17:09:47.647 INFO:tasks.workunit.client.0.vm01.stderr: Commit image migration: 3% complete... Commit image migration: 6% complete... Commit image migration: 9% complete... Commit image migration: 12% complete... Commit image migration: 15% complete... Commit image migration: 18% complete... Commit image migration: 21% complete... Commit image migration: 25% complete... Commit image migration: 28% complete...2026-03-24T17:09:47.641+0000 7fefa8f8c640 0 -- 192.168.123.101:0/3473643535 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x5648e000e160 msgr2=0x5648e0150080 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:47.657 INFO:tasks.workunit.client.0.vm01.stderr: Commit image migration: 31% complete... Commit image migration: 34% complete... Commit image migration: 37% complete... Commit image migration: 40% complete... Commit image migration: 43% complete... Commit image migration: 46% complete... Commit image migration: 50% complete... Commit image migration: 53% complete... Commit image migration: 56% complete... Commit image migration: 59% complete... Commit image migration: 62% complete... Commit image migration: 65% complete... Commit image migration: 68% complete... Commit image migration: 71% complete... Commit image migration: 75% complete... Commit image migration: 78% complete... Commit image migration: 81% complete... Commit image migration: 84% complete... Commit image migration: 87% complete... Commit image migration: 90% complete... Commit image migration: 93% complete...2026-03-24T17:09:47.649+0000 7fefa8f8c640 0 -- 192.168.123.101:0/3473643535 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7fef8805c4a0 msgr2=0x7fef8807c8a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T17:09:47.665 INFO:tasks.workunit.client.0.vm01.stderr: Commit image migration: 96% complete... Commit image migration: 100% complete...done.
2026-03-24T17:09:47.670 INFO:tasks.workunit.client.0.vm01.stderr:+ get_migration_state test1
2026-03-24T17:09:47.670 INFO:tasks.workunit.client.0.vm01.stderr:+ local image=test1
2026-03-24T17:09:47.670 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd --format xml status test1
2026-03-24T17:09:47.670 INFO:tasks.workunit.client.0.vm01.stderr:+ xmlstarlet sel -t -v //status/migration/state
2026-03-24T17:09:47.701 INFO:tasks.workunit.client.0.vm01.stderr:+ true
2026-03-24T17:09:47.701 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info test1
2026-03-24T17:09:47.701 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'features: .*layering'
2026-03-24T17:09:47.732 INFO:tasks.workunit.client.0.vm01.stderr:+ true
2026-03-24T17:09:47.732 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration prepare test1 --image-feature layering,exclusive-lock,object-map,fast-diff,deep-flatten
2026-03-24T17:09:47.805 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info test1
2026-03-24T17:09:47.805 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'features: .*layering'
2026-03-24T17:09:47.839 INFO:tasks.workunit.client.0.vm01.stdout: features: layering, exclusive-lock, object-map, fast-diff, deep-flatten, migrating
2026-03-24T17:09:47.839 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration execute test1
2026-03-24T17:09:47.891 INFO:tasks.workunit.client.0.vm01.stderr: Image migration: 3% complete... Image migration: 6% complete... Image migration: 9% complete... Image migration: 12% complete... Image migration: 15% complete... Image migration: 18% complete... Image migration: 21% complete... Image migration: 25% complete... Image migration: 28% complete... Image migration: 31% complete... Image migration: 34% complete... Image migration: 37% complete... Image migration: 40% complete... Image migration: 43% complete... Image migration: 46% complete... Image migration: 50% complete... Image migration: 53% complete... Image migration: 56% complete... Image migration: 59% complete... Image migration: 62% complete... Image migration: 65% complete... Image migration: 68% complete... Image migration: 71% complete... Image migration: 75% complete... Image migration: 78% complete... Image migration: 81% complete... Image migration: 84% complete... Image migration: 87% complete... Image migration: 90% complete... Image migration: 93% complete... Image migration: 96% complete...2026-03-24T17:09:47.885+0000 7f1b9a709640 0 -- 192.168.123.101:0/1244938908 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55c31004db60 msgr2=0x55c31018c930 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:47.899 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:47.893+0000 7f1b9a709640 0 -- 192.168.123.101:0/1244938908 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7f1b7805c5d0 msgr2=0x7f1b7807c9d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:47.900 INFO:tasks.workunit.client.0.vm01.stderr: Image migration: 100% complete...done.
2026-03-24T17:09:47.904 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration commit test1
2026-03-24T17:09:47.972 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:47.965+0000 7f219e90c640 0 -- 192.168.123.101:0/444433119 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x56323dc1e580 msgr2=0x56323dcb17e0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:47.980 INFO:tasks.workunit.client.0.vm01.stderr: Commit image migration: 3% complete... Commit image migration: 6% complete... Commit image migration: 9% complete... Commit image migration: 12% complete... Commit image migration: 15% complete... Commit image migration: 18% complete... Commit image migration: 21% complete... Commit image migration: 25% complete... Commit image migration: 28% complete...2026-03-24T17:09:47.973+0000 7f219ce82640 0 -- 192.168.123.101:0/444433119 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f217803d200 msgr2=0x7f217803eb50 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:48.003 INFO:tasks.workunit.client.0.vm01.stderr: Commit image migration: 31% complete... Commit image migration: 34% complete... Commit image migration: 37% complete... Commit image migration: 40% complete... Commit image migration: 43% complete... Commit image migration: 46% complete... Commit image migration: 50% complete... Commit image migration: 53% complete... Commit image migration: 56% complete... Commit image migration: 59% complete... Commit image migration: 62% complete... Commit image migration: 65% complete... Commit image migration: 68% complete... Commit image migration: 71% complete... Commit image migration: 75% complete... Commit image migration: 78% complete... Commit image migration: 81% complete... Commit image migration: 84% complete... Commit image migration: 87% complete... Commit image migration: 90% complete... Commit image migration: 93% complete... Commit image migration: 96% complete... Commit image migration: 100% complete...done.
2026-03-24T17:09:48.008 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration prepare test1 rbd2/test1
2026-03-24T17:09:48.079 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:48.073+0000 7f3d63d4c640 0 -- 192.168.123.101:0/1382016425 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x7f3d40004a30 msgr2=0x7f3d40024e10 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T17:09:48.086 INFO:tasks.workunit.client.0.vm01.stderr:++ get_migration_state rbd2/test1
2026-03-24T17:09:48.086 INFO:tasks.workunit.client.0.vm01.stderr:++ local image=rbd2/test1
2026-03-24T17:09:48.087 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd --format xml status rbd2/test1
2026-03-24T17:09:48.087 INFO:tasks.workunit.client.0.vm01.stderr:++ xmlstarlet sel -t -v //status/migration/state
2026-03-24T17:09:48.149 INFO:tasks.workunit.client.0.vm01.stderr:+ test prepared = prepared
2026-03-24T17:09:48.149 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls
2026-03-24T17:09:48.149 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:09:48.149 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^0$'
2026-03-24T17:09:48.174 INFO:tasks.workunit.client.0.vm01.stdout:0
2026-03-24T17:09:48.175 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd -p rbd2 ls
2026-03-24T17:09:48.175 INFO:tasks.workunit.client.0.vm01.stderr:+ grep test1
2026-03-24T17:09:48.199 INFO:tasks.workunit.client.0.vm01.stdout:test1
2026-03-24T17:09:48.199 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration execute test1
2026-03-24T17:09:48.260 INFO:tasks.workunit.client.0.vm01.stderr: Image migration: 3% complete... Image migration: 6% complete... Image migration: 9% complete... Image migration: 12% complete... Image migration: 15% complete... Image migration: 18% complete... Image migration: 21% complete... Image migration: 25% complete... Image migration: 28% complete... Image migration: 31% complete... Image migration: 34% complete... Image migration: 37% complete... Image migration: 40% complete... Image migration: 43% complete... Image migration: 46% complete... Image migration: 50% complete... Image migration: 53% complete... Image migration: 56% complete... Image migration: 59% complete... Image migration: 62% complete... Image migration: 65% complete... Image migration: 68% complete... Image migration: 71% complete... Image migration: 75% complete... Image migration: 78% complete... Image migration: 81% complete... Image migration: 84% complete... Image migration: 87% complete... Image migration: 90% complete... Image migration: 93% complete... Image migration: 96% complete...2026-03-24T17:09:48.253+0000 7fbbaac27640 0 -- 192.168.123.101:0/3418278994 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x7fbb840068f0 msgr2=0x7fbb84026cd0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:48.267 INFO:tasks.workunit.client.0.vm01.stderr: Image migration: 100% complete...done.
2026-03-24T17:09:48.272 INFO:tasks.workunit.client.0.vm01.stderr:++ get_migration_state rbd2/test1
2026-03-24T17:09:48.272 INFO:tasks.workunit.client.0.vm01.stderr:++ local image=rbd2/test1
2026-03-24T17:09:48.272 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd --format xml status rbd2/test1
2026-03-24T17:09:48.272 INFO:tasks.workunit.client.0.vm01.stderr:++ xmlstarlet sel -t -v //status/migration/state
2026-03-24T17:09:48.321 INFO:tasks.workunit.client.0.vm01.stderr:+ test executed = executed
2026-03-24T17:09:48.321 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm rbd2/test1
2026-03-24T17:09:48.349 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:48.341+0000 7f86aaffd640 -1 librbd::image::PreRemoveRequest: 0x55a2d0f404e0 validate_image_removal: image in migration state - not removing
2026-03-24T17:09:48.353 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 0% complete...failed.
2026-03-24T17:09:48.353 INFO:tasks.workunit.client.0.vm01.stderr:rbd: error: image still has watchers
2026-03-24T17:09:48.353 INFO:tasks.workunit.client.0.vm01.stderr:This means the image is still open or the client using it crashed. Try again after closing/unmapping it or waiting 30s for the crashed client to timeout.
2026-03-24T17:09:48.357 INFO:tasks.workunit.client.0.vm01.stderr:+ true
2026-03-24T17:09:48.357 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration commit test1
2026-03-24T17:09:48.415 INFO:tasks.workunit.client.0.vm01.stderr: Commit image migration: 3% complete... Commit image migration: 6% complete... Commit image migration: 9% complete... Commit image migration: 12% complete... Commit image migration: 15% complete... Commit image migration: 18% complete... Commit image migration: 21% complete... Commit image migration: 25% complete... Commit image migration: 28% complete... Commit image migration: 31% complete... Commit image migration: 34% complete... Commit image migration: 37% complete... Commit image migration: 40% complete... Commit image migration: 43% complete... Commit image migration: 46% complete... Commit image migration: 50% complete... Commit image migration: 53% complete... Commit image migration: 56% complete... Commit image migration: 59% complete... Commit image migration: 62% complete... Commit image migration: 65% complete... Commit image migration: 68% complete... Commit image migration: 71% complete... Commit image migration: 75% complete... Commit image migration: 78% complete... Commit image migration: 81% complete... Commit image migration: 84% complete... Commit image migration: 87% complete... Commit image migration: 90% complete... Commit image migration: 93% complete... Commit image migration: 96% complete...2026-03-24T17:09:48.409+0000 7fa1ef524640 0 -- 192.168.123.101:0/1391874333 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x56365a74b580 msgr2=0x56365a7de510 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:48.423 INFO:tasks.workunit.client.0.vm01.stderr: Commit image migration: 100% complete...done.
2026-03-24T17:09:48.427 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd namespace create rbd2/ns1
2026-03-24T17:09:48.456 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd namespace create rbd2/ns2
2026-03-24T17:09:48.485 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration prepare rbd2/test1 rbd2/ns1/test1
2026-03-24T17:09:48.551 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:48.545+0000 7fccf95bb640 0 -- 192.168.123.101:0/2775615022 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x55f5ecc099b0 msgr2=0x55f5ecc29d90 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:48.559 INFO:tasks.workunit.client.0.vm01.stderr:++ get_migration_state rbd2/ns1/test1
2026-03-24T17:09:48.559 INFO:tasks.workunit.client.0.vm01.stderr:++ local image=rbd2/ns1/test1
2026-03-24T17:09:48.559 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd --format xml status rbd2/ns1/test1
2026-03-24T17:09:48.559 INFO:tasks.workunit.client.0.vm01.stderr:++ xmlstarlet sel -t -v //status/migration/state
2026-03-24T17:09:48.617 INFO:tasks.workunit.client.0.vm01.stderr:+ test prepared = prepared
2026-03-24T17:09:48.617 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration execute rbd2/test1
2026-03-24T17:09:48.667 INFO:tasks.workunit.client.0.vm01.stderr: Image migration: 3% complete... Image migration: 6% complete... Image migration: 9% complete... Image migration: 12% complete... Image migration: 15% complete... Image migration: 18% complete... Image migration: 21% complete... Image migration: 25% complete... Image migration: 28% complete... Image migration: 31% complete... Image migration: 34% complete... Image migration: 37% complete... Image migration: 40% complete... Image migration: 43% complete... Image migration: 46% complete... Image migration: 50% complete... Image migration: 53% complete... Image migration: 56% complete... Image migration: 59% complete... Image migration: 62% complete... Image migration: 65% complete... Image migration: 68% complete... Image migration: 71% complete... Image migration: 75% complete... Image migration: 78% complete... Image migration: 81% complete... Image migration: 84% complete... Image migration: 87% complete... Image migration: 90% complete... Image migration: 93% complete... Image migration: 96% complete...2026-03-24T17:09:48.661+0000 7f767aafd640 0 -- 192.168.123.101:0/1582704527 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x55a5d1291f30 msgr2=0x55a5d13d11f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:48.673 INFO:tasks.workunit.client.0.vm01.stderr: Image migration: 100% complete...done.
2026-03-24T17:09:48.677 INFO:tasks.workunit.client.0.vm01.stderr:++ get_migration_state rbd2/ns1/test1
2026-03-24T17:09:48.677 INFO:tasks.workunit.client.0.vm01.stderr:++ local image=rbd2/ns1/test1
2026-03-24T17:09:48.678 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd --format xml status rbd2/ns1/test1
2026-03-24T17:09:48.678 INFO:tasks.workunit.client.0.vm01.stderr:++ xmlstarlet sel -t -v //status/migration/state
2026-03-24T17:09:48.733 INFO:tasks.workunit.client.0.vm01.stderr:+ test executed = executed
2026-03-24T17:09:48.734 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration commit rbd2/test1
2026-03-24T17:09:48.797 INFO:tasks.workunit.client.0.vm01.stderr: Commit image migration: 3% complete... Commit image migration: 6% complete... Commit image migration: 9% complete... Commit image migration: 12% complete... Commit image migration: 15% complete... Commit image migration: 18% complete... Commit image migration: 21% complete... Commit image migration: 25% complete... Commit image migration: 28% complete... Commit image migration: 31% complete... Commit image migration: 34% complete... Commit image migration: 37% complete... Commit image migration: 40% complete... Commit image migration: 43% complete... Commit image migration: 46% complete... Commit image migration: 50% complete... Commit image migration: 53% complete... Commit image migration: 56% complete... Commit image migration: 59% complete... Commit image migration: 62% complete... Commit image migration: 65% complete... Commit image migration: 68% complete... Commit image migration: 71% complete... Commit image migration: 75% complete... Commit image migration: 78% complete... Commit image migration: 81% complete... Commit image migration: 84% complete... Commit image migration: 87% complete... Commit image migration: 90% complete... Commit image migration: 93% complete... Commit image migration: 96% complete...2026-03-24T17:09:48.789+0000 7efcfa032640 0 -- 192.168.123.101:0/2599920962 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x557d53fa9580 msgr2=0x557d5403b830 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:48.804 INFO:tasks.workunit.client.0.vm01.stderr: Commit image migration: 100% complete...done.
2026-03-24T17:09:48.808 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration prepare rbd2/ns1/test1 rbd2/ns2/test1
2026-03-24T17:09:48.869 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:48.861+0000 7f32eb928640 0 -- 192.168.123.101:0/3555171949 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x563868cd6af0 msgr2=0x563868cf6ed0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T17:09:48.877 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration execute rbd2/ns2/test1
2026-03-24T17:09:48.924 INFO:tasks.workunit.client.0.vm01.stderr: Image migration: 3% complete... Image migration: 6% complete... Image migration: 9% complete... Image migration: 12% complete... Image migration: 15% complete... Image migration: 18% complete... Image migration: 21% complete... Image migration: 25% complete... Image migration: 28% complete... Image migration: 31% complete... Image migration: 34% complete... Image migration: 37% complete... Image migration: 40% complete... Image migration: 43% complete... Image migration: 46% complete... Image migration: 50% complete... Image migration: 53% complete... Image migration: 56% complete... Image migration: 59% complete... Image migration: 62% complete... Image migration: 65% complete... Image migration: 68% complete... Image migration: 71% complete... Image migration: 75% complete... Image migration: 78% complete... Image migration: 81% complete... Image migration: 84% complete... Image migration: 87% complete... Image migration: 90% complete... Image migration: 93% complete... Image migration: 96% complete...2026-03-24T17:09:48.917+0000 7fe6b7fff640 0 -- 192.168.123.101:0/1673818640 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x560c656ac620 msgr2=0x560c656cca00 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:48.932 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:48.925+0000 7fe6b7fff640 0 -- 192.168.123.101:0/1673818640 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x7fe69805c500 msgr2=0x7fe69807c900 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:48.934 INFO:tasks.workunit.client.0.vm01.stderr: Image migration: 100% complete...done.
2026-03-24T17:09:48.938 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration commit rbd2/ns2/test1
2026-03-24T17:09:49.083 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:49.077+0000 7f8ae71e7640 0 -- 192.168.123.101:0/1155299950 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x556205e1c970 msgr2=0x556205e3cd50 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:49.089 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:49.081+0000 7f8ae69e6640 0 -- 192.168.123.101:0/1155299950 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7f8ac400dd90 msgr2=0x7f8ac400e230 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:49.302 INFO:tasks.workunit.client.0.vm01.stderr: Commit image migration: 3% complete... Commit image migration: 6% complete... Commit image migration: 9% complete... Commit image migration: 12% complete... Commit image migration: 15% complete... Commit image migration: 18% complete... Commit image migration: 21% complete... Commit image migration: 25% complete... Commit image migration: 28% complete... Commit image migration: 31% complete... Commit image migration: 34% complete... Commit image migration: 37% complete... Commit image migration: 40% complete... Commit image migration: 43% complete... Commit image migration: 46% complete... Commit image migration: 50% complete... Commit image migration: 53% complete... Commit image migration: 56% complete... Commit image migration: 59% complete... Commit image migration: 62% complete... Commit image migration: 65% complete... Commit image migration: 68% complete... Commit image migration: 71% complete... Commit image migration: 75% complete... Commit image migration: 78% complete... Commit image migration: 81% complete... Commit image migration: 84% complete... Commit image migration: 87% complete... Commit image migration: 90% complete... Commit image migration: 93% complete... Commit image migration: 96% complete... Commit image migration: 100% complete...done.
2026-03-24T17:09:49.306 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create -s 128M test1
2026-03-24T17:09:49.330 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:49.321+0000 7f92f4b57200 -1 librbd: Forced V1 image creation.
2026-03-24T17:09:49.337 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration prepare test1 --data-pool rbd2
2026-03-24T17:09:49.397 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info test1
2026-03-24T17:09:49.397 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'data_pool: rbd2'
2026-03-24T17:09:49.429 INFO:tasks.workunit.client.0.vm01.stdout: data_pool: rbd2
2026-03-24T17:09:49.429 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration execute test1
2026-03-24T17:09:49.482 INFO:tasks.workunit.client.0.vm01.stderr: Image migration: 3% complete... Image migration: 6% complete... Image migration: 9% complete... Image migration: 12% complete... Image migration: 15% complete... Image migration: 18% complete... Image migration: 21% complete... Image migration: 25% complete... Image migration: 28% complete... Image migration: 31% complete... Image migration: 34% complete... Image migration: 37% complete... Image migration: 40% complete... Image migration: 43% complete... Image migration: 46% complete... Image migration: 50% complete... Image migration: 53% complete... Image migration: 56% complete... Image migration: 59% complete... Image migration: 62% complete... Image migration: 65% complete... Image migration: 68% complete... Image migration: 71% complete... Image migration: 75% complete... Image migration: 78% complete... Image migration: 81% complete... Image migration: 84% complete... Image migration: 87% complete... Image migration: 90% complete... Image migration: 93% complete... Image migration: 96% complete...2026-03-24T17:09:49.473+0000 7f26c1f3f640 0 -- 192.168.123.101:0/2786162968 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55c7e7c49b60 msgr2=0x55c7e7cdc0f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:49.486 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:49.481+0000 7f26c1f3f640 0 -- 192.168.123.101:0/2786162968 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7f26a005c570 msgr2=0x7f26a007c970 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:49.491 INFO:tasks.workunit.client.0.vm01.stderr: Image migration: 100% complete...done.
2026-03-24T17:09:49.495 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration commit test1
2026-03-24T17:09:49.558 INFO:tasks.workunit.client.0.vm01.stderr: Commit image migration: 3% complete... Commit image migration: 6% complete... Commit image migration: 9% complete... Commit image migration: 12% complete... Commit image migration: 15% complete... Commit image migration: 18% complete... Commit image migration: 21% complete... Commit image migration: 25% complete... Commit image migration: 28% complete... Commit image migration: 31% complete...2026-03-24T17:09:49.549+0000 7fde19f0e640 0 -- 192.168.123.101:0/886107838 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55cd2ebe1aa0 msgr2=0x55cd2ec4ffc0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:49.582 INFO:tasks.workunit.client.0.vm01.stderr: Commit image migration: 34% complete... Commit image migration: 37% complete... Commit image migration: 40% complete... Commit image migration: 43% complete... Commit image migration: 46% complete... Commit image migration: 50% complete... Commit image migration: 53% complete... Commit image migration: 56% complete... Commit image migration: 59% complete... Commit image migration: 62% complete... Commit image migration: 65% complete... Commit image migration: 68% complete... Commit image migration: 71% complete... Commit image migration: 75% complete... Commit image migration: 78% complete... Commit image migration: 81% complete... Commit image migration: 84% complete... Commit image migration: 87% complete... Commit image migration: 90% complete...2026-03-24T17:09:49.573+0000 7fde19f0e640 0 -- 192.168.123.101:0/886107838 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7fddf805c4a0 msgr2=0x7fddf807c8a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T17:09:49.592 INFO:tasks.workunit.client.0.vm01.stderr: Commit image migration: 93% complete... Commit image migration: 96% complete... Commit image migration: 100% complete...done.
2026-03-24T17:09:49.596 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration prepare test1
2026-03-24T17:09:49.664 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd trash mv test1
2026-03-24T17:09:49.664 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv test1
2026-03-24T17:09:49.695 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:49.689+0000 7f2541612200 -1 librbd::api::Trash: move: cannot move migrating image to trash
2026-03-24T17:09:49.697 INFO:tasks.workunit.client.0.vm01.stderr:rbd: deferred delete error: (16) Device or resource busy
2026-03-24T17:09:49.701 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:09:49.702 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd trash ls -a
2026-03-24T17:09:49.702 INFO:tasks.workunit.client.0.vm01.stderr:++ cut -d ' ' -f 1
2026-03-24T17:09:49.732 INFO:tasks.workunit.client.0.vm01.stderr:+ ID=19c87b6f541
2026-03-24T17:09:49.732 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd trash rm 19c87b6f541
2026-03-24T17:09:49.732 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash rm 19c87b6f541
2026-03-24T17:09:49.766 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:49.757+0000 7f626145c640 -1 librbd::image::RefreshRequest: image being migrated
2026-03-24T17:09:49.766 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:49.757+0000 7f6260c5b640 -1 librbd::image::OpenRequest: failed to refresh image: (30) Read-only file system
2026-03-24T17:09:49.766 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:49.757+0000 7f626145c640 -1 librbd::ImageState: 0x7f624003c0e0 failed to open image: (30) Read-only file system
2026-03-24T17:09:49.766 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:49.761+0000 7f623b7fe640 -1 librbd::image::RemoveRequest: 0x7f6240000b90 handle_open_image: error opening image: (30) Read-only file system
2026-03-24T17:09:49.768 INFO:tasks.workunit.client.0.vm01.stderr:rbd: remove error: (30) Read-only file system
2026-03-24T17:09:49.768 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 0% complete...failed.
2026-03-24T17:09:49.772 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:09:49.772 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd trash restore 19c87b6f541
2026-03-24T17:09:49.772 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash restore 19c87b6f541
2026-03-24T17:09:49.796 INFO:tasks.workunit.client.0.vm01.stderr:rbd: restore error: (22) Invalid argument2026-03-24T17:09:49.789+0000 7f72d4704200 -1 librbd::api::Trash: restore: Current trash source 'migration' does not match expected: user,mirroring,unknown (4)
2026-03-24T17:09:49.796 INFO:tasks.workunit.client.0.vm01.stderr:
2026-03-24T17:09:49.800 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:09:49.800 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration abort test1
2026-03-24T17:09:49.858 INFO:tasks.workunit.client.0.vm01.stderr: Abort image migration: 1% complete... Abort image migration: 3% complete... Abort image migration: 4% complete... Abort image migration: 6% complete... Abort image migration: 7% complete... Abort image migration: 9% complete... Abort image migration: 10% complete... Abort image migration: 12% complete... Abort image migration: 14% complete... Abort image migration: 15% complete... Abort image migration: 17% complete... Abort image migration: 18% complete... Abort image migration: 20% complete... Abort image migration: 21% complete... Abort image migration: 23% complete... Abort image migration: 25% complete... Abort image migration: 26% complete... Abort image migration: 28% complete... Abort image migration: 29% complete... Abort image migration: 31% complete... Abort image migration: 32% complete... Abort image migration: 34% complete... Abort image migration: 35% complete... Abort image migration: 37% complete... Abort image migration: 39% complete... Abort image migration: 40% complete... Abort image migration: 42% complete... Abort image migration: 43% complete... Abort image migration: 45% complete... Abort image migration: 46% complete... Abort image migration: 48% complete... Abort image migration: 50% complete... Abort image migration: 51% complete... Abort image migration: 53% complete... Abort image migration: 54% complete... Abort image migration: 56% complete... Abort image migration: 57% complete... Abort image migration: 59% complete... Abort image migration: 60% complete... Abort image migration: 62% complete... Abort image migration: 64% complete... Abort image migration: 65% complete... Abort image migration: 67% complete... Abort image migration: 68% complete... Abort image migration: 70% complete... Abort image migration: 71% complete... Abort image migration: 73% complete... Abort image migration: 75% complete... Abort image migration: 76% complete... Abort image migration: 78% complete... Abort image migration: 79% complete... Abort image migration: 81% complete... Abort image migration: 82% complete... Abort image migration: 84% complete... Abort image migration: 85% complete... Abort image migration: 87% complete... Abort image migration: 89% complete...2026-03-24T17:09:49.849+0000 7f1c6ca8e640 0 -- 192.168.123.101:0/3572814751 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x7f1c48004930 msgr2=0x7f1c48004dd0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:49.868 INFO:tasks.workunit.client.0.vm01.stderr: Abort image migration: 90% complete... Abort image migration: 92% complete... Abort image migration: 93% complete... Abort image migration: 95% complete... Abort image migration: 96% complete... Abort image migration: 98% complete...2026-03-24T17:09:49.861+0000 7f1c6e518640 0 -- 192.168.123.101:0/3572814751 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7f1c4c05c570 msgr2=0x7f1c4c07c970 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T17:09:49.879 INFO:tasks.workunit.client.0.vm01.stderr: Abort image migration: 100% complete...done.
2026-03-24T17:09:49.883 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd remove test1
2026-03-24T17:09:49.949 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 3% complete... Removing image: 6% complete... Removing image: 9% complete... Removing image: 12% complete... Removing image: 15% complete... Removing image: 18% complete... Removing image: 21% complete... Removing image: 25% complete... Removing image: 28% complete... Removing image: 31% complete... Removing image: 34% complete... Removing image: 37% complete... Removing image: 40% complete... Removing image: 43% complete... Removing image: 46% complete... Removing image: 50% complete... Removing image: 53% complete... Removing image: 56% complete... Removing image: 59% complete... Removing image: 62% complete... Removing image: 65% complete... Removing image: 68% complete... Removing image: 71% complete... Removing image: 75% complete... Removing image: 78% complete... Removing image: 81% complete... Removing image: 84% complete... Removing image: 87% complete... Removing image: 90% complete... Removing image: 93% complete... Removing image: 96% complete...2026-03-24T17:09:49.941+0000 7fef144d2640 0 -- 192.168.123.101:0/66022964 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7feef405c4a0 msgr2=0x7feef407c8a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:49.949 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:09:49.953 INFO:tasks.workunit.client.0.vm01.stderr:+ dd if=/dev/urandom bs=1M count=1
2026-03-24T17:09:49.954 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd --image-format 2 import - test1
2026-03-24T17:09:49.993 INFO:tasks.workunit.client.0.vm01.stderr:1+0 records in
2026-03-24T17:09:49.993 INFO:tasks.workunit.client.0.vm01.stderr:1+0 records out
2026-03-24T17:09:49.993 INFO:tasks.workunit.client.0.vm01.stderr:1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0381787 s, 27.5 MB/s
2026-03-24T17:09:50.028 INFO:tasks.workunit.client.0.vm01.stderr: Importing image: 100% complete...done.
2026-03-24T17:09:50.032 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd export test1 -
2026-03-24T17:09:50.032 INFO:tasks.workunit.client.0.vm01.stderr:++ md5sum
2026-03-24T17:09:50.068 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 100% complete...done.
2026-03-24T17:09:50.072 INFO:tasks.workunit.client.0.vm01.stderr:+ md5sum='02da09de6313ea7a3ec0a33da674ede3 -'
2026-03-24T17:09:50.072 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create test1@snap1
2026-03-24T17:09:51.165 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T17:09:51.216 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap protect test1@snap1
2026-03-24T17:09:51.253 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create test1@snap2
2026-03-24T17:09:51.794 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T17:09:51.801 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone test1@snap1 clone_v1 --rbd_default_clone_format=1
2026-03-24T17:09:51.853 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone test1@snap2 clone_v2 --rbd_default_clone_format=2
2026-03-24T17:09:51.900 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info clone_v1
2026-03-24T17:09:51.900 INFO:tasks.workunit.client.0.vm01.stderr:+ fgrep 'parent: rbd/test1@snap1'
2026-03-24T17:09:51.934 INFO:tasks.workunit.client.0.vm01.stdout: parent: rbd/test1@snap1
2026-03-24T17:09:51.935 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info clone_v2
2026-03-24T17:09:51.935 INFO:tasks.workunit.client.0.vm01.stderr:+ fgrep 'parent: rbd/test1@snap2'
2026-03-24T17:09:51.969 INFO:tasks.workunit.client.0.vm01.stdout: parent: rbd/test1@snap2
2026-03-24T17:09:51.969 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info clone_v2
2026-03-24T17:09:51.970 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'op_features: clone-child'
2026-03-24T17:09:52.004 INFO:tasks.workunit.client.0.vm01.stdout: op_features: clone-child
2026-03-24T17:09:52.005 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd export clone_v1 -
2026-03-24T17:09:52.005 INFO:tasks.workunit.client.0.vm01.stderr:++ md5sum
2026-03-24T17:09:52.040 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 100% complete...done.
2026-03-24T17:09:52.044 INFO:tasks.workunit.client.0.vm01.stderr:+ test '02da09de6313ea7a3ec0a33da674ede3 -' = '02da09de6313ea7a3ec0a33da674ede3 -'
2026-03-24T17:09:52.045 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd export clone_v2 -
2026-03-24T17:09:52.045 INFO:tasks.workunit.client.0.vm01.stderr:++ md5sum
2026-03-24T17:09:52.078 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 100% complete...done.
2026-03-24T17:09:52.082 INFO:tasks.workunit.client.0.vm01.stderr:+ test '02da09de6313ea7a3ec0a33da674ede3 -' = '02da09de6313ea7a3ec0a33da674ede3 -'
2026-03-24T17:09:52.083 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd children test1@snap1
2026-03-24T17:09:52.121 INFO:tasks.workunit.client.0.vm01.stderr:+ test rbd/clone_v1 = rbd/clone_v1
2026-03-24T17:09:52.121 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd children test1@snap2
2026-03-24T17:09:52.357 INFO:tasks.workunit.client.0.vm01.stderr:+ test rbd/clone_v2 = rbd/clone_v2
2026-03-24T17:09:52.358 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration prepare test1 rbd2/test2
2026-03-24T17:09:52.426 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:52.417+0000 7f3ee369f640 0 -- 192.168.123.101:0/34358743 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f3ebc008d30 msgr2=0x7f3ebc0291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:54.163 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:54.157+0000 7f3ee369f640 0 -- 192.168.123.101:0/34358743 >> [v2:192.168.123.101:3300/0,v1:192.168.123.101:6789/0] conn(0x559f3b2b1950 msgr2=0x559f3b3707a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:54.219 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info clone_v1
2026-03-24T17:09:54.220 INFO:tasks.workunit.client.0.vm01.stderr:+ fgrep 'parent: rbd2/test2@snap1'
2026-03-24T17:09:54.265 INFO:tasks.workunit.client.0.vm01.stdout: parent: rbd2/test2@snap1
2026-03-24T17:09:54.266 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info clone_v2
2026-03-24T17:09:54.266 INFO:tasks.workunit.client.0.vm01.stderr:+ fgrep 'parent: rbd2/test2@snap2'
2026-03-24T17:09:54.313 INFO:tasks.workunit.client.0.vm01.stdout: parent: rbd2/test2@snap2
2026-03-24T17:09:54.313 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info clone_v2
2026-03-24T17:09:54.313 INFO:tasks.workunit.client.0.vm01.stderr:+ fgrep 'op_features: clone-child'
2026-03-24T17:09:54.358 INFO:tasks.workunit.client.0.vm01.stdout: op_features: clone-child
2026-03-24T17:09:54.358 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd children rbd2/test2@snap1
2026-03-24T17:09:54.403 INFO:tasks.workunit.client.0.vm01.stderr:+ test rbd/clone_v1 = rbd/clone_v1
2026-03-24T17:09:54.403 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd children rbd2/test2@snap2
2026-03-24T17:09:54.442 INFO:tasks.workunit.client.0.vm01.stderr:+ test rbd/clone_v2 = rbd/clone_v2
2026-03-24T17:09:54.442 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration execute test1
2026-03-24T17:09:54.527 INFO:tasks.workunit.client.0.vm01.stderr: Image migration: 100% complete...done.
2026-03-24T17:09:54.531 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd migration commit test1
2026-03-24T17:09:54.531 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration commit test1
2026-03-24T17:09:54.586 INFO:tasks.workunit.client.0.vm01.stderr:rbd: the image has descendants:
2026-03-24T17:09:54.586 INFO:tasks.workunit.client.0.vm01.stderr: rbd/clone_v1
2026-03-24T17:09:54.586 INFO:tasks.workunit.client.0.vm01.stderr: rbd/clone_v2
2026-03-24T17:09:54.586 INFO:tasks.workunit.client.0.vm01.stderr:Warning: in-use, read-only descendant images will not detect the parent update.
2026-03-24T17:09:54.586 INFO:tasks.workunit.client.0.vm01.stderr:Ensure no descendant images are opened read-only and run again with force flag.
2026-03-24T17:09:54.590 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:09:54.590 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration commit test1 --force
2026-03-24T17:09:54.643 INFO:tasks.workunit.client.0.vm01.stderr:rbd: the image has descendants:
2026-03-24T17:09:54.643 INFO:tasks.workunit.client.0.vm01.stderr: rbd/clone_v1
2026-03-24T17:09:54.643 INFO:tasks.workunit.client.0.vm01.stderr: rbd/clone_v2
2026-03-24T17:09:54.643 INFO:tasks.workunit.client.0.vm01.stderr:Warning: in-use, read-only descendant images will not detect the parent update.
2026-03-24T17:09:54.643 INFO:tasks.workunit.client.0.vm01.stderr:Proceeding anyway due to force flag set.
2026-03-24T17:09:54.646 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:54.641+0000 7f890ecfa640 0 -- 192.168.123.101:0/769029502 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f88e800a8f0 msgr2=0x7f88e802ad70 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:54.651 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:54.645+0000 7f890ecfa640 0 -- 192.168.123.101:0/769029502 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x562605403c40 msgr2=0x7f88f009d660 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:56.204 INFO:tasks.workunit.client.0.vm01.stderr: Commit image migration: 100% complete...done.
2026-03-24T17:09:56.209 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd export clone_v1 -
2026-03-24T17:09:56.209 INFO:tasks.workunit.client.0.vm01.stderr:++ md5sum
2026-03-24T17:09:56.244 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 100% complete...done.
2026-03-24T17:09:56.249 INFO:tasks.workunit.client.0.vm01.stderr:+ test '02da09de6313ea7a3ec0a33da674ede3 -' = '02da09de6313ea7a3ec0a33da674ede3 -'
2026-03-24T17:09:56.250 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd export clone_v2 -
2026-03-24T17:09:56.250 INFO:tasks.workunit.client.0.vm01.stderr:++ md5sum
2026-03-24T17:09:56.284 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 100% complete...done.
2026-03-24T17:09:56.289 INFO:tasks.workunit.client.0.vm01.stderr:+ test '02da09de6313ea7a3ec0a33da674ede3 -' = '02da09de6313ea7a3ec0a33da674ede3 -'
2026-03-24T17:09:56.289 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration prepare rbd2/test2 test1
2026-03-24T17:09:56.352 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:56.345+0000 7fe4c7d52640 0 -- 192.168.123.101:0/1460361420 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x7fe4a4004a30 msgr2=0x7fe4a4024e10 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:57.789 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:57.781+0000 7fe4c7d52640 0 -- 192.168.123.101:0/1460361420 >> [v2:192.168.123.101:3300/0,v1:192.168.123.101:6789/0] conn(0x55e57c4a4720 msgr2=0x55e57c3eaee0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:09:58.046 INFO:tasks.workunit.client.0.vm01.stderr:+ fgrep 'parent: rbd/test1@snap1'
2026-03-24T17:09:58.046 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info clone_v1
2026-03-24T17:09:58.089 INFO:tasks.workunit.client.0.vm01.stdout: parent: rbd/test1@snap1
2026-03-24T17:09:58.089 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info clone_v2
2026-03-24T17:09:58.089 INFO:tasks.workunit.client.0.vm01.stderr:+ fgrep 'parent: rbd/test1@snap2'
2026-03-24T17:09:58.131 INFO:tasks.workunit.client.0.vm01.stdout: parent: rbd/test1@snap2
2026-03-24T17:09:58.131 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info clone_v2
2026-03-24T17:09:58.131 INFO:tasks.workunit.client.0.vm01.stderr:+ fgrep 'op_features: clone-child'
2026-03-24T17:09:58.200 INFO:tasks.workunit.client.0.vm01.stdout: op_features: clone-child
2026-03-24T17:09:58.200 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd children test1@snap1
2026-03-24T17:09:58.245 INFO:tasks.workunit.client.0.vm01.stderr:+ test rbd/clone_v1 = rbd/clone_v1
2026-03-24T17:09:58.245 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd children test1@snap2
2026-03-24T17:09:58.287 INFO:tasks.workunit.client.0.vm01.stderr:+ test rbd/clone_v2 = rbd/clone_v2
2026-03-24T17:09:58.287 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration execute test1
2026-03-24T17:09:58.378 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:09:58.369+0000 7f6eed589640 0 -- 192.168.123.101:0/2045702017 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x7f6ec4008d30 msgr2=0x7f6ec40291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:10:43.317 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:10:43.314+0000 7f6eecd88640 0 -- 192.168.123.101:0/2045702017 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f6ec4076e20 msgr2=0x7f6ec40972a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:18:18.374 INFO:tasks.workunit.client.0.vm01.stderr: Image migration: 100% complete...done.
2026-03-24T17:18:18.377 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd migration commit test1
2026-03-24T17:18:18.377 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration commit test1
2026-03-24T17:18:18.423 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:18:18.418+0000 7f5e3e283640 0 -- 192.168.123.101:0/1038650525 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55d80d6af580 msgr2=0x55d80d741e60 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:18:18.423 INFO:tasks.workunit.client.0.vm01.stderr:rbd: the image has descendants:
2026-03-24T17:18:18.423 INFO:tasks.workunit.client.0.vm01.stderr: rbd/clone_v1
2026-03-24T17:18:18.423 INFO:tasks.workunit.client.0.vm01.stderr: rbd/clone_v2
2026-03-24T17:18:18.423 INFO:tasks.workunit.client.0.vm01.stderr:Warning: in-use, read-only descendant images will not detect the parent update.
2026-03-24T17:18:18.423 INFO:tasks.workunit.client.0.vm01.stderr:Ensure no descendant images are opened read-only and run again with force flag.
2026-03-24T17:18:18.426 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:18:18.426 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration commit test1 --force
2026-03-24T17:18:18.472 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:18:18.466+0000 7f2df4a7f640 0 -- 192.168.123.101:0/19265619 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x7f2dcc008d70 msgr2=0x7f2dcc0291f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T17:18:18.474 INFO:tasks.workunit.client.0.vm01.stderr:rbd: the image has descendants:
2026-03-24T17:18:18.474 INFO:tasks.workunit.client.0.vm01.stderr: rbd/clone_v1
2026-03-24T17:18:18.474 INFO:tasks.workunit.client.0.vm01.stderr: rbd/clone_v2
2026-03-24T17:18:18.474 INFO:tasks.workunit.client.0.vm01.stderr:Warning: in-use, read-only descendant images will not detect the parent update.
2026-03-24T17:18:18.474 INFO:tasks.workunit.client.0.vm01.stderr:Proceeding anyway due to force flag set.
2026-03-24T17:18:18.477 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:18:18.470+0000 7f2df4a7f640 0 -- 192.168.123.101:0/19265619 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x558236830ab0 msgr2=0x7f2dd409d5f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T17:18:20.087 INFO:tasks.workunit.client.0.vm01.stderr: Commit image migration: 100% complete...done.
2026-03-24T17:18:20.091 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd export clone_v1 -
2026-03-24T17:18:20.091 INFO:tasks.workunit.client.0.vm01.stderr:++ md5sum
2026-03-24T17:18:20.174 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 100% complete...done.
2026-03-24T17:18:20.178 INFO:tasks.workunit.client.0.vm01.stderr:+ test '02da09de6313ea7a3ec0a33da674ede3 -' = '02da09de6313ea7a3ec0a33da674ede3 -'
2026-03-24T17:18:20.179 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd export clone_v2 -
2026-03-24T17:18:20.179 INFO:tasks.workunit.client.0.vm01.stderr:++ md5sum
2026-03-24T17:18:20.211 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 100% complete...done.
2026-03-24T17:18:20.215 INFO:tasks.workunit.client.0.vm01.stderr:+ test '02da09de6313ea7a3ec0a33da674ede3 -' = '02da09de6313ea7a3ec0a33da674ede3 -'
2026-03-24T17:18:20.215 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd remove clone_v1
2026-03-24T17:18:20.280 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:18:20.274+0000 7ff78200f640 0 -- 192.168.123.101:0/3595213506 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7ff760005830 msgr2=0x7ff7600073a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:18:20.283 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:18:20.287 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd remove clone_v2
2026-03-24T17:18:20.346 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:18:20.338+0000 7f0a9bfff640 0 -- 192.168.123.101:0/3166337393 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f0a78008d30 msgr2=0x7f0a780291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:18:20.351 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:18:20.354 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap unprotect test1@snap1
2026-03-24T17:18:20.388 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap purge test1
2026-03-24T17:18:22.082 INFO:tasks.workunit.client.0.vm01.stderr: Removing all snapshots: 50% complete... Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done.
2026-03-24T17:18:22.088 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test1
2026-03-24T17:18:22.158 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:18:22.150+0000 7f3625189640 0 -- 192.168.123.101:0/1283774691 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x55c009375f60 msgr2=0x55c0093963e0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:18:22.166 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:18:22.169 INFO:tasks.workunit.client.0.vm01.stderr:+ for format in 1 2
2026-03-24T17:18:22.169 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create -s 128M --image-format 1 test2
2026-03-24T17:18:22.184 INFO:tasks.workunit.client.0.vm01.stderr:rbd: image format 1 is deprecated
2026-03-24T17:18:22.190 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:18:22.186+0000 7f55d22f0200 -1 librbd: Forced V1 image creation.
2026-03-24T17:18:22.197 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration prepare test2 --data-pool rbd2
2026-03-24T17:18:22.246 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-24T17:18:22.274 INFO:tasks.workunit.client.0.vm01.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-24T17:18:22.276 INFO:tasks.workunit.client.0.vm01.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-24T17:18:22.277 INFO:tasks.workunit.client.0.vm01.stdout:elapsed: 0 ops: 1 ops/sec: inf bytes/sec: 0 B/s
2026-03-24T17:18:22.281 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration abort test2
2026-03-24T17:18:22.328 INFO:tasks.workunit.client.0.vm01.stderr: Abort image migration: 1% complete... Abort image migration: 3% complete... Abort image migration: 4% complete... Abort image migration: 6% complete... Abort image migration: 7% complete... Abort image migration: 9% complete... Abort image migration: 10% complete... Abort image migration: 12% complete... Abort image migration: 14% complete... Abort image migration: 15% complete... Abort image migration: 17% complete... Abort image migration: 18% complete... Abort image migration: 20% complete... Abort image migration: 21% complete... Abort image migration: 23% complete... Abort image migration: 25% complete... Abort image migration: 26% complete... Abort image migration: 28% complete... Abort image migration: 29% complete... Abort image migration: 31% complete... Abort image migration: 32% complete... Abort image migration: 34% complete... Abort image migration: 35% complete... Abort image migration: 37% complete... Abort image migration: 39% complete... Abort image migration: 40% complete... Abort image migration: 42% complete... Abort image migration: 43% complete... Abort image migration: 45% complete... Abort image migration: 46% complete... Abort image migration: 48% complete... Abort image migration: 50% complete... Abort image migration: 51% complete... Abort image migration: 53% complete... Abort image migration: 54% complete... Abort image migration: 56% complete... Abort image migration: 57% complete... Abort image migration: 59% complete... Abort image migration: 60% complete... Abort image migration: 62% complete... Abort image migration: 64% complete... Abort image migration: 65% complete... Abort image migration: 67% complete... Abort image migration: 68% complete... Abort image migration: 70% complete... Abort image migration: 71% complete... Abort image migration: 73% complete... Abort image migration: 75% complete... Abort image migration: 76% complete... Abort image migration: 78% complete... Abort image migration: 79% complete... Abort image migration: 81% complete... Abort image migration: 82% complete... Abort image migration: 84% complete... Abort image migration: 85% complete... Abort image migration: 87% complete... Abort image migration: 89% complete... Abort image migration: 90% complete... Abort image migration: 92% complete... Abort image migration: 93% complete... Abort image migration: 95% complete... Abort image migration: 96% complete... Abort image migration: 98% complete...2026-03-24T17:18:22.322+0000 7fda68afb640 0 -- 192.168.123.101:0/2652211858 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x7fda44004a40 msgr2=0x7fda44025440 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:18:22.341 INFO:tasks.workunit.client.0.vm01.stderr: Abort image migration: 100% complete...done.
2026-03-24T17:18:22.345 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-24T17:18:22.371 INFO:tasks.workunit.client.0.vm01.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-24T17:18:22.373 INFO:tasks.workunit.client.0.vm01.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-24T17:18:22.373 INFO:tasks.workunit.client.0.vm01.stdout:elapsed: 0 ops: 1 ops/sec: inf bytes/sec: 0 B/s
2026-03-24T17:18:22.377 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test2
2026-03-24T17:18:22.415 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 3% complete... Removing image: 6% complete... Removing image: 9% complete... Removing image: 12% complete... Removing image: 15% complete... Removing image: 18% complete... Removing image: 21% complete... Removing image: 25% complete... Removing image: 28% complete... Removing image: 31% complete... Removing image: 34% complete... Removing image: 37% complete... Removing image: 40% complete... Removing image: 43% complete... Removing image: 46% complete... Removing image: 50% complete... Removing image: 53% complete... Removing image: 56% complete... Removing image: 59% complete... Removing image: 62% complete... Removing image: 65% complete... Removing image: 68% complete... Removing image: 71% complete... Removing image: 75% complete... Removing image: 78% complete... Removing image: 81% complete... Removing image: 84% complete... Removing image: 87% complete... Removing image: 90% complete... Removing image: 93% complete... Removing image: 96% complete... Removing image: 100% complete...done.
2026-03-24T17:18:22.418 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create -s 128M --image-format 1 test2
2026-03-24T17:18:22.437 INFO:tasks.workunit.client.0.vm01.stderr:rbd: image format 1 is deprecated
2026-03-24T17:18:22.446 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:18:22.442+0000 7fc20d566200 -1 librbd: Forced V1 image creation.
2026-03-24T17:18:22.453 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration prepare test2 --data-pool rbd2
2026-03-24T17:18:22.504 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-24T17:18:22.532 INFO:tasks.workunit.client.0.vm01.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-24T17:18:22.535 INFO:tasks.workunit.client.0.vm01.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-24T17:18:22.535 INFO:tasks.workunit.client.0.vm01.stdout:elapsed: 0 ops: 1 ops/sec: 250 bytes/sec: 250 KiB/s
2026-03-24T17:18:22.539 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration execute test2
2026-03-24T17:18:22.589 INFO:tasks.workunit.client.0.vm01.stderr: Image migration: 3% complete... Image migration: 6% complete... Image migration: 9% complete... Image migration: 12% complete... Image migration: 15% complete... Image migration: 18% complete... Image migration: 21% complete... Image migration: 25% complete... Image migration: 28% complete... Image migration: 31% complete... Image migration: 34% complete... Image migration: 37% complete... Image migration: 40% complete... Image migration: 43% complete... Image migration: 46% complete... Image migration: 50% complete... Image migration: 53% complete... Image migration: 56% complete... Image migration: 59% complete... Image migration: 62% complete... Image migration: 65% complete... Image migration: 68% complete... Image migration: 71% complete... Image migration: 75% complete... Image migration: 78% complete... Image migration: 81% complete... Image migration: 84% complete... Image migration: 87% complete... Image migration: 90% complete... Image migration: 93% complete... Image migration: 96% complete...2026-03-24T17:18:22.582+0000 7f8f64f08640 0 -- 192.168.123.101:0/3851073248 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x55a1614ffe40 msgr2=0x55a161640d30 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T17:18:22.593 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:18:22.586+0000 7f8f64f08640 0 -- 192.168.123.101:0/3851073248 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f8f4405c5d0 msgr2=0x7f8f4407c9d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:18:22.597 INFO:tasks.workunit.client.0.vm01.stderr: Image migration: 100% complete...done.
2026-03-24T17:18:22.600 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration abort test2
2026-03-24T17:18:22.644 INFO:tasks.workunit.client.0.vm01.stderr: Abort image migration: 1% complete... Abort image migration: 3% complete... Abort image migration: 4% complete... Abort image migration: 6% complete... Abort image migration: 7% complete... Abort image migration: 9% complete... Abort image migration: 10% complete... Abort image migration: 12% complete... Abort image migration: 14% complete... Abort image migration: 15% complete... Abort image migration: 17% complete... Abort image migration: 18% complete... Abort image migration: 20% complete... Abort image migration: 21% complete... Abort image migration: 23% complete... Abort image migration: 25% complete... Abort image migration: 26% complete... Abort image migration: 28% complete... Abort image migration: 29% complete... Abort image migration: 31% complete... Abort image migration: 32% complete... Abort image migration: 34% complete... Abort image migration: 35% complete... Abort image migration: 37% complete... Abort image migration: 39% complete... Abort image migration: 40% complete... Abort image migration: 42% complete... Abort image migration: 43% complete... Abort image migration: 45% complete... Abort image migration: 46% complete... Abort image migration: 48% complete... Abort image migration: 50% complete... Abort image migration: 51% complete... Abort image migration: 53% complete... Abort image migration: 54% complete... Abort image migration: 56% complete... Abort image migration: 57% complete... Abort image migration: 59% complete... Abort image migration: 60% complete... Abort image migration: 62% complete... Abort image migration: 64% complete... Abort image migration: 65% complete... Abort image migration: 67% complete... Abort image migration: 68% complete... Abort image migration: 70% complete... Abort image migration: 71% complete... Abort image migration: 73% complete... Abort image migration: 75% complete... Abort image migration: 76% complete... Abort image migration: 78% complete... Abort image migration: 79% complete... Abort image migration: 81% complete... Abort image migration: 82% complete... Abort image migration: 84% complete... Abort image migration: 85% complete... Abort image migration: 87% complete... Abort image migration: 89% complete... Abort image migration: 90% complete... Abort image migration: 92% complete... Abort image migration: 93% complete... Abort image migration: 95% complete... Abort image migration: 96% complete... Abort image migration: 98% complete...2026-03-24T17:18:22.638+0000 7f691ffff640 0 -- 192.168.123.101:0/2123695061 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x7f6904008d30 msgr2=0x7f69040291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:18:22.656 INFO:tasks.workunit.client.0.vm01.stderr: Abort image migration: 100% complete...done.
2026-03-24T17:18:22.660 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-24T17:18:22.685 INFO:tasks.workunit.client.0.vm01.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-24T17:18:22.687 INFO:tasks.workunit.client.0.vm01.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-24T17:18:22.687 INFO:tasks.workunit.client.0.vm01.stdout:elapsed: 0 ops: 1 ops/sec: 250 bytes/sec: 250 KiB/s
2026-03-24T17:18:22.691 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test2
2026-03-24T17:18:22.731 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 3% complete... Removing image: 6% complete... Removing image: 9% complete... Removing image: 12% complete... Removing image: 15% complete... Removing image: 18% complete... Removing image: 21% complete... Removing image: 25% complete... Removing image: 28% complete... Removing image: 31% complete... Removing image: 34% complete... Removing image: 37% complete... Removing image: 40% complete... Removing image: 43% complete... Removing image: 46% complete... Removing image: 50% complete... Removing image: 53% complete... Removing image: 56% complete... Removing image: 59% complete... Removing image: 62% complete... Removing image: 65% complete... Removing image: 68% complete... Removing image: 71% complete... Removing image: 75% complete... Removing image: 78% complete... Removing image: 81% complete... Removing image: 84% complete... Removing image: 87% complete... Removing image: 90% complete... Removing image: 93% complete... Removing image: 96% complete... Removing image: 100% complete...done.
2026-03-24T17:18:22.734 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create -s 128M --image-format 1 test2
2026-03-24T17:18:22.749 INFO:tasks.workunit.client.0.vm01.stderr:rbd: image format 1 is deprecated
2026-03-24T17:18:22.755 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:18:22.750+0000 7f67ccad4200 -1 librbd: Forced V1 image creation.
2026-03-24T17:18:22.762 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration prepare test2 --data-pool INVALID_DATA_POOL
2026-03-24T17:18:22.792 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:18:22.786+0000 7f72895f4200 -1 librbd::image::CreateRequest: 0x55fde1f0d740 validate_data_pool: data pool INVALID_DATA_POOL does not exist
2026-03-24T17:18:22.793 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:18:22.786+0000 7f72895f4200 -1 librbd::Migration: create_dst_image: header creation failed: (2) No such file or directory
2026-03-24T17:18:22.797 INFO:tasks.workunit.client.0.vm01.stderr:rbd: preparing migration failed: (2) No such file or directory
2026-03-24T17:18:22.800 INFO:tasks.workunit.client.0.vm01.stderr:+ true
2026-03-24T17:18:22.800 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-24T17:18:22.825 INFO:tasks.workunit.client.0.vm01.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-24T17:18:22.827 INFO:tasks.workunit.client.0.vm01.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-24T17:18:22.827 INFO:tasks.workunit.client.0.vm01.stdout:elapsed: 0 ops: 1 ops/sec: 250 bytes/sec: 250 KiB/s
2026-03-24T17:18:22.831 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test2
2026-03-24T17:18:22.869 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 3% complete... Removing image: 6% complete... Removing image: 9% complete... Removing image: 12% complete... Removing image: 15% complete... Removing image: 18% complete... Removing image: 21% complete... Removing image: 25% complete... Removing image: 28% complete... Removing image: 31% complete... Removing image: 34% complete... Removing image: 37% complete... Removing image: 40% complete... Removing image: 43% complete... Removing image: 46% complete... Removing image: 50% complete... Removing image: 53% complete... Removing image: 56% complete... Removing image: 59% complete... Removing image: 62% complete... Removing image: 65% complete... Removing image: 68% complete... Removing image: 71% complete... Removing image: 75% complete... Removing image: 78% complete... Removing image: 81% complete... Removing image: 84% complete... Removing image: 87% complete... Removing image: 90% complete... Removing image: 93% complete... Removing image: 96% complete... Removing image: 100% complete...done.
2026-03-24T17:18:22.873 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create -s 128M --image-format 1 test2
2026-03-24T17:18:22.887 INFO:tasks.workunit.client.0.vm01.stderr:rbd: image format 1 is deprecated
2026-03-24T17:18:22.897 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:18:22.890+0000 7f5042835200 -1 librbd: Forced V1 image creation.
2026-03-24T17:18:22.905 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration prepare test2 rbd2/test2
2026-03-24T17:18:22.956 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 rbd2/test2
2026-03-24T17:18:22.985 INFO:tasks.workunit.client.0.vm01.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-24T17:18:22.988 INFO:tasks.workunit.client.0.vm01.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-24T17:18:22.988 INFO:tasks.workunit.client.0.vm01.stdout:elapsed: 0 ops: 1 ops/sec: 250 bytes/sec: 250 KiB/s
2026-03-24T17:18:22.992 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration abort test2
2026-03-24T17:18:23.049 INFO:tasks.workunit.client.0.vm01.stderr: Abort image migration: 1% complete... Abort image migration: 3% complete... Abort image migration: 4% complete... Abort image migration: 6% complete... Abort image migration: 7% complete... Abort image migration: 9% complete... Abort image migration: 10% complete... Abort image migration: 12% complete... Abort image migration: 14% complete... Abort image migration: 15% complete... Abort image migration: 17% complete... Abort image migration: 18% complete... Abort image migration: 20% complete... Abort image migration: 21% complete... Abort image migration: 23% complete... Abort image migration: 25% complete... Abort image migration: 26% complete... Abort image migration: 28% complete... Abort image migration: 29% complete... Abort image migration: 31% complete... Abort image migration: 32% complete... Abort image migration: 34% complete... Abort image migration: 35% complete... Abort image migration: 37% complete... Abort image migration: 39% complete... Abort image migration: 40% complete... Abort image migration: 42% complete... Abort image migration: 43% complete... Abort image migration: 45% complete... Abort image migration: 46% complete... Abort image migration: 48% complete... Abort image migration: 50% complete... Abort image migration: 51% complete... Abort image migration: 53% complete... Abort image migration: 54% complete... Abort image migration: 56% complete... Abort image migration: 57% complete... Abort image migration: 59% complete... Abort image migration: 60% complete... Abort image migration: 62% complete... Abort image migration: 64% complete... Abort image migration: 65% complete... Abort image migration: 67% complete... Abort image migration: 68% complete... Abort image migration: 70% complete... Abort image migration: 71% complete... Abort image migration: 73% complete... Abort image migration: 75% complete... Abort image migration: 76% complete... Abort image migration: 78% complete... Abort image migration: 79% complete... Abort image migration: 81% complete... Abort image migration: 82% complete... Abort image migration: 84% complete... Abort image migration: 85% complete... Abort image migration: 87% complete... Abort image migration: 89% complete... Abort image migration: 90% complete... Abort image migration: 92% complete... Abort image migration: 93% complete... Abort image migration: 95% complete... Abort image migration: 96% complete... Abort image migration: 98% complete...2026-03-24T17:18:23.042+0000 7ffb50ba0640 0 -- 192.168.123.101:0/2864693137 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x5572cac58e90 msgr2=0x5572cace9ba0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:18:23.051 INFO:tasks.workunit.client.0.vm01.stderr: Abort image migration: 100% complete...done.
2026-03-24T17:18:23.055 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-24T17:18:23.085 INFO:tasks.workunit.client.0.vm01.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-24T17:18:23.087 INFO:tasks.workunit.client.0.vm01.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-24T17:18:23.087 INFO:tasks.workunit.client.0.vm01.stdout:elapsed: 0 ops: 1 ops/sec: 250 bytes/sec: 250 KiB/s
2026-03-24T17:18:23.091 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test2
2026-03-24T17:18:23.143 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 3% complete... Removing image: 6% complete... Removing image: 9% complete... Removing image: 12% complete... Removing image: 15% complete... Removing image: 18% complete... Removing image: 21% complete... Removing image: 25% complete... Removing image: 28% complete... Removing image: 31% complete... Removing image: 34% complete... Removing image: 37% complete... Removing image: 40% complete... Removing image: 43% complete... Removing image: 46% complete... Removing image: 50% complete... Removing image: 53% complete... Removing image: 56% complete... Removing image: 59% complete... Removing image: 62% complete... Removing image: 65% complete... Removing image: 68% complete... Removing image: 71% complete... Removing image: 75% complete... Removing image: 78% complete... Removing image: 81% complete... Removing image: 84% complete... Removing image: 87% complete... Removing image: 90% complete... Removing image: 93% complete... Removing image: 96% complete... Removing image: 100% complete...done.
2026-03-24T17:18:23.146 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create -s 128M --image-format 1 test2
2026-03-24T17:18:23.162 INFO:tasks.workunit.client.0.vm01.stderr:rbd: image format 1 is deprecated
2026-03-24T17:18:23.169 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:18:23.162+0000 7f4c680dc200 -1 librbd: Forced V1 image creation.
2026-03-24T17:18:23.177 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration prepare test2 rbd2/test2
2026-03-24T17:18:23.228 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration abort rbd2/test2
2026-03-24T17:18:23.282 INFO:tasks.workunit.client.0.vm01.stderr: Abort image migration: 1% complete... Abort image migration: 3% complete... Abort image migration: 4% complete... Abort image migration: 6% complete... Abort image migration: 7% complete... Abort image migration: 9% complete... Abort image migration: 10% complete... Abort image migration: 12% complete... Abort image migration: 14% complete... Abort image migration: 15% complete... Abort image migration: 17% complete... Abort image migration: 18% complete... Abort image migration: 20% complete... Abort image migration: 21% complete... Abort image migration: 23% complete... Abort image migration: 25% complete... Abort image migration: 26% complete... Abort image migration: 28% complete... Abort image migration: 29% complete... Abort image migration: 31% complete... Abort image migration: 32% complete... Abort image migration: 34% complete... Abort image migration: 35% complete... Abort image migration: 37% complete... Abort image migration: 39% complete... Abort image migration: 40% complete... Abort image migration: 42% complete... Abort image migration: 43% complete... Abort image migration: 45% complete... Abort image migration: 46% complete... Abort image migration: 48% complete... Abort image migration: 50% complete... Abort image migration: 51% complete... Abort image migration: 53% complete... Abort image migration: 54% complete... Abort image migration: 56% complete... Abort image migration: 57% complete... Abort image migration: 59% complete... Abort image migration: 60% complete... Abort image migration: 62% complete... Abort image migration: 64% complete... Abort image migration: 65% complete... Abort image migration: 67% complete... Abort image migration: 68% complete... Abort image migration: 70% complete... Abort image migration: 71% complete... Abort image migration: 73% complete... Abort image migration: 75% complete... Abort image migration: 76% complete... Abort image migration: 78% complete... Abort image migration: 79% complete... Abort image migration: 81% complete... Abort image migration: 82% complete... Abort image migration: 84% complete... Abort image migration: 85% complete... Abort image migration: 87% complete... Abort image migration: 89% complete... Abort image migration: 90% complete... Abort image migration: 92% complete... Abort image migration: 93% complete... Abort image migration: 95% complete... Abort image migration: 96% complete... Abort image migration: 98% complete...2026-03-24T17:18:23.274+0000 7f3b537da640 0 -- 192.168.123.101:0/2076421287 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x5589e6825b60 msgr2=0x5589e68b8190 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T17:18:23.289 INFO:tasks.workunit.client.0.vm01.stderr: Abort image migration: 100% complete...done.
2026-03-24T17:18:23.292 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-24T17:18:23.317 INFO:tasks.workunit.client.0.vm01.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-24T17:18:23.319 INFO:tasks.workunit.client.0.vm01.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-24T17:18:23.320 INFO:tasks.workunit.client.0.vm01.stdout:elapsed: 0 ops: 1 ops/sec: 250 bytes/sec: 250 KiB/s
2026-03-24T17:18:23.324 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test2
2026-03-24T17:18:23.367 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 3% complete... Removing image: 6% complete... Removing image: 9% complete... Removing image: 12% complete... Removing image: 15% complete... Removing image: 18% complete... Removing image: 21% complete... Removing image: 25% complete... Removing image: 28% complete... Removing image: 31% complete... Removing image: 34% complete... Removing image: 37% complete... Removing image: 40% complete... Removing image: 43% complete... Removing image: 46% complete... Removing image: 50% complete... Removing image: 53% complete... Removing image: 56% complete... Removing image: 59% complete... Removing image: 62% complete... Removing image: 65% complete... Removing image: 68% complete... Removing image: 71% complete... Removing image: 75% complete... Removing image: 78% complete... Removing image: 81% complete... Removing image: 84% complete... Removing image: 87% complete... Removing image: 90% complete... Removing image: 93% complete... Removing image: 96% complete... Removing image: 100% complete...done.
2026-03-24T17:18:23.373 INFO:tasks.workunit.client.0.vm01.stderr:+ test 1 = 1
2026-03-24T17:18:23.373 INFO:tasks.workunit.client.0.vm01.stderr:+ continue
2026-03-24T17:18:23.373 INFO:tasks.workunit.client.0.vm01.stderr:+ for format in 1 2
2026-03-24T17:18:23.373 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create -s 128M --image-format 2 test2
2026-03-24T17:18:23.407 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration prepare test2 --data-pool rbd2
2026-03-24T17:18:23.468 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:18:23.462+0000 7f8e92402640 0 -- 192.168.123.101:0/3869207603 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7f8e74008d30 msgr2=0x7f8e740291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T17:18:23.474 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-24T17:18:23.509 INFO:tasks.workunit.client.0.vm01.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-24T17:18:23.513 INFO:tasks.workunit.client.0.vm01.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-24T17:18:23.513 INFO:tasks.workunit.client.0.vm01.stdout:elapsed: 0 ops: 1 ops/sec: 250 bytes/sec: 250 KiB/s
2026-03-24T17:18:23.520 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration abort test2
2026-03-24T17:18:23.581 INFO:tasks.workunit.client.0.vm01.stderr: Abort image migration: 1% complete... Abort image migration: 3% complete... Abort image migration: 4% complete... Abort image migration: 6% complete... Abort image migration: 7% complete... Abort image migration: 9% complete... Abort image migration: 10% complete... Abort image migration: 12% complete... Abort image migration: 14% complete... Abort image migration: 15% complete... Abort image migration: 17% complete... Abort image migration: 18% complete... Abort image migration: 20% complete... Abort image migration: 21% complete... Abort image migration: 23% complete... Abort image migration: 25% complete... Abort image migration: 26% complete... Abort image migration: 28% complete... Abort image migration: 29% complete... Abort image migration: 31% complete... Abort image migration: 32% complete... Abort image migration: 34% complete... Abort image migration: 35% complete... Abort image migration: 37% complete... Abort image migration: 39% complete... Abort image migration: 40% complete... Abort image migration: 42% complete... Abort image migration: 43% complete... Abort image migration: 45% complete... Abort image migration: 46% complete... Abort image migration: 48% complete... Abort image migration: 50% complete... Abort image migration: 51% complete... Abort image migration: 53% complete... Abort image migration: 54% complete... Abort image migration: 56% complete... Abort image migration: 57% complete... Abort image migration: 59% complete... Abort image migration: 60% complete... Abort image migration: 62% complete... Abort image migration: 64% complete... Abort image migration: 65% complete... Abort image migration: 67% complete... Abort image migration: 68% complete... Abort image migration: 70% complete... Abort image migration: 71% complete... Abort image migration: 73% complete... Abort image migration: 75% complete... Abort image migration: 76% complete... Abort image migration: 78% complete... Abort image migration: 79% complete... Abort image migration: 81% complete... Abort image migration: 82% complete... Abort image migration: 84% complete... Abort image migration: 85% complete... Abort image migration: 87% complete... Abort image migration: 89% complete... Abort image migration: 90% complete... Abort image migration: 92% complete... Abort image migration: 93% complete... Abort image migration: 95% complete... Abort image migration: 96% complete... Abort image migration: 98% complete...2026-03-24T17:18:23.574+0000 7fbeed0fd640 0 -- 192.168.123.101:0/1557915462 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7fbec4008d30 msgr2=0x7fbec40291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:18:23.585 INFO:tasks.workunit.client.0.vm01.stderr: Abort image migration: 100% complete...done.
2026-03-24T17:18:23.588 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-24T17:18:23.618 INFO:tasks.workunit.client.0.vm01.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-24T17:18:23.619 INFO:tasks.workunit.client.0.vm01.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-24T17:18:23.619 INFO:tasks.workunit.client.0.vm01.stdout:elapsed: 0 ops: 1 ops/sec: inf bytes/sec: 0 B/s
2026-03-24T17:18:23.625 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test2
2026-03-24T17:18:23.687 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 3% complete... Removing image: 6% complete... Removing image: 9% complete... Removing image: 12% complete... Removing image: 15% complete... Removing image: 18% complete... Removing image: 21% complete... Removing image: 25% complete... Removing image: 28% complete... Removing image: 31% complete... Removing image: 34% complete... Removing image: 37% complete... Removing image: 40% complete... Removing image: 43% complete... Removing image: 46% complete... Removing image: 50% complete... Removing image: 53% complete... Removing image: 56% complete... Removing image: 59% complete... Removing image: 62% complete... Removing image: 65% complete... Removing image: 68% complete... Removing image: 71% complete... Removing image: 75% complete... Removing image: 78% complete... Removing image: 81% complete... Removing image: 84% complete... Removing image: 87% complete... Removing image: 90% complete... Removing image: 93% complete... Removing image: 96% complete...2026-03-24T17:18:23.682+0000 7f91e36ef640 0 -- 192.168.123.101:0/3150824366 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x555586f8d5b0 msgr2=0x555586f7d4b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:18:23.692 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:18:23.695 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create -s 128M --image-format 2 test2
2026-03-24T17:18:23.727 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration prepare test2 --data-pool rbd2
2026-03-24T17:18:23.788 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:18:23.782+0000 7f700d51d640 0 -- 192.168.123.101:0/2397832069 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x5612ca854f70 msgr2=0x5612ca8e7f70 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:18:23.800 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-24T17:18:23.832 INFO:tasks.workunit.client.0.vm01.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-24T17:18:23.835 INFO:tasks.workunit.client.0.vm01.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-24T17:18:23.835 INFO:tasks.workunit.client.0.vm01.stdout:elapsed: 0 ops: 1 ops/sec: 250 bytes/sec: 250 KiB/s
2026-03-24T17:18:23.841 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration execute test2
2026-03-24T17:18:23.890 INFO:tasks.workunit.client.0.vm01.stderr: Image migration: 3% complete... Image migration: 6% complete... Image migration: 9% complete... Image migration: 12% complete... Image migration: 15% complete... Image migration: 18% complete... Image migration: 21% complete... Image migration: 25% complete... Image migration: 28% complete... Image migration: 31% complete... Image migration: 34% complete... Image migration: 37% complete... Image migration: 40% complete... Image migration: 43% complete... Image migration: 46% complete... Image migration: 50% complete... Image migration: 53% complete... Image migration: 56% complete... Image migration: 59% complete... Image migration: 62% complete... Image migration: 65% complete... Image migration: 68% complete... Image migration: 71% complete... Image migration: 75% complete... Image migration: 78% complete... Image migration: 81% complete... Image migration: 84% complete... Image migration: 87% complete... Image migration: 90% complete... Image migration: 93% complete... Image migration: 96% complete...2026-03-24T17:18:23.886+0000 7f97dd7e0640 0 -- 192.168.123.101:0/3570752608 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x55df04a93e40 msgr2=0x55df04bd50a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:18:23.898 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:18:23.894+0000 7f97d77fe640 0 -- 192.168.123.101:0/3570752608 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55df04bd9da0 msgr2=0x55df04bfa180 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:18:23.898 INFO:tasks.workunit.client.0.vm01.stderr: Image migration: 100% complete...done.
2026-03-24T17:18:23.901 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration abort test2
2026-03-24T17:18:23.969 INFO:tasks.workunit.client.0.vm01.stderr: Abort image migration: 1% complete... Abort image migration: 3% complete... Abort image migration: 4% complete... Abort image migration: 6% complete... Abort image migration: 7% complete... Abort image migration: 9% complete... Abort image migration: 10% complete... Abort image migration: 12% complete... Abort image migration: 14% complete... Abort image migration: 15% complete... Abort image migration: 17% complete... Abort image migration: 18% complete... Abort image migration: 20% complete... Abort image migration: 21% complete... Abort image migration: 23% complete... Abort image migration: 25% complete... Abort image migration: 26% complete... Abort image migration: 28% complete... Abort image migration: 29% complete... Abort image migration: 31% complete... Abort image migration: 32% complete... Abort image migration: 34% complete... Abort image migration: 35% complete... Abort image migration: 37% complete... Abort image migration: 39% complete... Abort image migration: 40% complete... Abort image migration: 42% complete... Abort image migration: 43% complete... Abort image migration: 45% complete... Abort image migration: 46% complete... Abort image migration: 48% complete... Abort image migration: 50% complete... Abort image migration: 51% complete... Abort image migration: 53% complete... Abort image migration: 54% complete... Abort image migration: 56% complete... Abort image migration: 57% complete... Abort image migration: 59% complete... Abort image migration: 60% complete... Abort image migration: 62% complete... Abort image migration: 64% complete... Abort image migration: 65% complete... Abort image migration: 67% complete... Abort image migration: 68% complete... Abort image migration: 70% complete... Abort image migration: 71% complete... Abort image migration: 73% complete... Abort image migration: 75% complete... Abort image migration: 76% complete... Abort image migration: 78% complete... Abort image migration: 79% complete... Abort image migration: 81% complete... Abort image migration: 82% complete... Abort image migration: 84% complete... Abort image migration: 85% complete... Abort image migration: 87% complete... Abort image migration: 89% complete... Abort image migration: 90% complete... Abort image migration: 92% complete... Abort image migration: 93% complete... Abort image migration: 95% complete... Abort image migration: 96% complete... Abort image migration: 98% complete...2026-03-24T17:18:23.962+0000 7f7605e7d640 0 -- 192.168.123.101:0/1869982548 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55bce7c87440 msgr2=0x55bce7ca7820 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:18:23.972 INFO:tasks.workunit.client.0.vm01.stderr: Abort image migration: 100% complete...done.
2026-03-24T17:18:23.976 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-24T17:18:24.005 INFO:tasks.workunit.client.0.vm01.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-24T17:18:24.006 INFO:tasks.workunit.client.0.vm01.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-24T17:18:24.006 INFO:tasks.workunit.client.0.vm01.stdout:elapsed: 0 ops: 1 ops/sec: 250 bytes/sec: 250 KiB/s
2026-03-24T17:18:24.012 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test2
2026-03-24T17:18:24.082 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 3% complete... Removing image: 6% complete... Removing image: 9% complete... Removing image: 12% complete... Removing image: 15% complete... Removing image: 18% complete... Removing image: 21% complete... Removing image: 25% complete... Removing image: 28% complete... Removing image: 31% complete... Removing image: 34% complete... Removing image: 37% complete... Removing image: 40% complete... Removing image: 43% complete... Removing image: 46% complete... Removing image: 50% complete... Removing image: 53% complete... Removing image: 56% complete... Removing image: 59% complete... Removing image: 62% complete... Removing image: 65% complete... Removing image: 68% complete... Removing image: 71% complete... Removing image: 75% complete... Removing image: 78% complete... Removing image: 81% complete... Removing image: 84% complete... Removing image: 87% complete... Removing image: 90% complete... Removing image: 93% complete... Removing image: 96% complete...2026-03-24T17:18:24.078+0000 7f642f271640 0 -- 192.168.123.101:0/2124216745 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x5603cb35b320 msgr2=0x5603cb390390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:18:24.087 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:18:24.090 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create -s 128M --image-format 2 test2
2026-03-24T17:18:24.122 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration prepare test2 --data-pool INVALID_DATA_POOL
2026-03-24T17:18:24.172 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:18:24.166+0000 7f94d399c200 -1 librbd::image::CreateRequest: 0x563dcb80c740 validate_data_pool: data pool INVALID_DATA_POOL does not exist
2026-03-24T17:18:24.172 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:18:24.166+0000 7f94d399c200 -1 librbd::Migration: create_dst_image: header creation failed: (2) No such file or directory
2026-03-24T17:18:24.188 INFO:tasks.workunit.client.0.vm01.stderr:rbd: preparing migration failed: (2) No such file or directory
2026-03-24T17:18:24.192 INFO:tasks.workunit.client.0.vm01.stderr:+ true
2026-03-24T17:18:24.192 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-24T17:18:24.223 INFO:tasks.workunit.client.0.vm01.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-24T17:18:24.225 INFO:tasks.workunit.client.0.vm01.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-24T17:18:24.225 INFO:tasks.workunit.client.0.vm01.stdout:elapsed: 0 ops: 1 ops/sec: inf bytes/sec: 0 B/s
2026-03-24T17:18:24.233 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test2
2026-03-24T17:18:24.300 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 3% complete... Removing image: 6% complete...
Removing image: 9% complete... Removing image: 12% complete... Removing image: 15% complete... Removing image: 18% complete... Removing image: 21% complete... Removing image: 25% complete... Removing image: 28% complete... Removing image: 31% complete... Removing image: 34% complete... Removing image: 37% complete... Removing image: 40% complete... Removing image: 43% complete... Removing image: 46% complete... Removing image: 50% complete... Removing image: 53% complete... Removing image: 56% complete... Removing image: 59% complete... Removing image: 62% complete... Removing image: 65% complete... Removing image: 68% complete... Removing image: 71% complete... Removing image: 75% complete... Removing image: 78% complete... Removing image: 81% complete... Removing image: 84% complete... Removing image: 87% complete... Removing image: 90% complete... Removing image: 93% complete... Removing image: 96% complete...2026-03-24T17:18:24.294+0000 7f9af81e9640 0 -- 192.168.123.101:0/4040529878 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x564f79ccd320 msgr2=0x564f79d02390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:18:24.306 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 
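The `+ true` immediately after the failed `rbd migration prepare test2 --data-pool INVALID_DATA_POOL` shows the workunit treating that failure as the expected outcome. A minimal sketch of that negative-test pattern, in the style of the qa/workunits helpers (the helper name and body here are an assumption, not the actual script source):

```shell
#!/bin/sh
# Sketch (assumption, not the actual workunit source) of the expect-failure
# pattern visible in the trace above: run a command that MUST fail, and turn
# an unexpected success into a test failure.
expect_fail() {
    if "$@"; then
        echo "expected failure, but command succeeded: $*" >&2
        return 1
    fi
    return 0   # the command failed, which is exactly what we wanted
}

# e.g. preparing a migration to a nonexistent data pool must fail:
#   expect_fail rbd migration prepare test2 --data-pool INVALID_DATA_POOL
expect_fail false && echo "negative test passed"
```

The same helper explains the `+ expect_fail rbd config global set ...` / `+ return 0` pairs later in this log: each invalid invocation fails as required, so the helper returns success.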
2026-03-24T17:18:24.310 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create -s 128M --image-format 2 test2
2026-03-24T17:18:24.346 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration prepare test2 rbd2/test2
2026-03-24T17:18:24.409 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:18:24.402+0000 7f9c16da5640 0 -- 192.168.123.101:0/326053234 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x562e6f0d1e60 msgr2=0x562e6f161eb0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:18:24.416 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 rbd2/test2
2026-03-24T17:18:24.448 INFO:tasks.workunit.client.0.vm01.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-24T17:18:24.451 INFO:tasks.workunit.client.0.vm01.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-24T17:18:24.451 INFO:tasks.workunit.client.0.vm01.stdout:elapsed: 0 ops: 1 ops/sec: 250 bytes/sec: 250 KiB/s
2026-03-24T17:18:24.456 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration abort test2
2026-03-24T17:18:24.524 INFO:tasks.workunit.client.0.vm01.stderr: Abort image migration: 1% complete... [...] Abort image migration: 100% complete...done.
2026-03-24T17:18:24.528 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-24T17:18:24.559 INFO:tasks.workunit.client.0.vm01.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-24T17:18:24.561 INFO:tasks.workunit.client.0.vm01.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-24T17:18:24.561 INFO:tasks.workunit.client.0.vm01.stdout:elapsed: 0 ops: 1 ops/sec: inf bytes/sec: 0 B/s
2026-03-24T17:18:24.567 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test2
2026-03-24T17:18:24.632 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 3% complete... [...] Removing image: 96% complete...
2026-03-24T17:18:24.626+0000 7fcdb969e640 0 -- 192.168.123.101:0/2859205936 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x5572ef04f320 msgr2=0x5572ef02d790 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:18:24.638 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
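Some bench lines above report `ops/sec: inf bytes/sec: 0 B/s`. A plausible reading (an assumption on my part, not taken from the rbd source) is that with `--io-total` equal to `--io-size` the run finishes in effectively zero elapsed time, and dividing ops by that interval yields infinity. A toy awk illustration of that arithmetic:

```shell
# Toy illustration (not rbd source) of the "ops/sec: inf" lines: ops divided
# by a positive elapsed time gives a finite rate, while a zero elapsed time
# is reported as infinity.
rate() {
    awk -v ops="$1" -v secs="$2" \
        'BEGIN { if (secs > 0) print ops / secs; else print "inf" }'
}

rate 1 0.004   # -> 250, like the healthy "ops/sec: 250" lines
rate 1 0       # -> inf, like the "ops/sec: inf" lines
```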
2026-03-24T17:18:24.642 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create -s 128M --image-format 2 test2
2026-03-24T17:18:24.674 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration prepare test2 rbd2/test2
2026-03-24T17:18:24.731 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:18:24.726+0000 7f7bbb907640 0 -- 192.168.123.101:0/552012434 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x55e8a3f45e60 msgr2=0x55e8a3fd6020 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:18:24.737 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration abort rbd2/test2
2026-03-24T17:18:24.794 INFO:tasks.workunit.client.0.vm01.stderr: Abort image migration: 1% complete... [...] Abort image migration: 98% complete...
2026-03-24T17:18:24.786+0000 7f3c32658640 0 -- 192.168.123.101:0/1647483253 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x5618f64bc310 msgr2=0x5618f64dc6f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:18:24.794 INFO:tasks.workunit.client.0.vm01.stderr: Abort image migration: 100% complete...done.
2026-03-24T17:18:24.797 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-24T17:18:24.825 INFO:tasks.workunit.client.0.vm01.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-24T17:18:24.828 INFO:tasks.workunit.client.0.vm01.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-24T17:18:24.828 INFO:tasks.workunit.client.0.vm01.stdout:elapsed: 0 ops: 1 ops/sec: 250 bytes/sec: 250 KiB/s
2026-03-24T17:18:24.834 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test2
2026-03-24T17:18:25.049 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:18:25.042+0000 7f745ef2c640 0 --2- 192.168.123.101:0/2555478720 >> v2:192.168.123.101:3300/0 conn(0x5613becdd150 0x5613becfd530 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
2026-03-24T17:18:25.090 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 3% complete... [...] Removing image: 96% complete...
2026-03-24T17:18:25.086+0000 7f745f72d640 0 -- 192.168.123.101:0/641912602 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x5613bed49460 msgr2=0x5613bed698e0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:18:25.098 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:18:25.101 INFO:tasks.workunit.client.0.vm01.stderr:+ test 2 = 1
2026-03-24T17:18:25.101 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create -s 128M --image-format 2 test2
2026-03-24T17:18:25.136 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration prepare test2 rbd2/ns1/test3
2026-03-24T17:18:25.200 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:18:25.194+0000 7f8434fcc640 0 -- 192.168.123.101:0/1980980666 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x7f8410004a40 msgr2=0x7f8410025440 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:18:25.209 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 rbd2/ns1/test3
2026-03-24T17:18:25.241 INFO:tasks.workunit.client.0.vm01.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-24T17:18:25.243 INFO:tasks.workunit.client.0.vm01.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-24T17:18:25.243 INFO:tasks.workunit.client.0.vm01.stdout:elapsed: 0 ops: 1 ops/sec: 250 bytes/sec: 250 KiB/s
2026-03-24T17:18:25.249 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd migration abort test2
2026-03-24T17:18:25.516 INFO:tasks.workunit.client.0.vm01.stderr: Abort image migration: 1% complete... [...] Abort image migration: 98% complete...
2026-03-24T17:18:25.510+0000 7f527200e640 0 -- 192.168.123.101:0/2081649418 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7f525005c500 msgr2=0x7f525007c900 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:18:25.516 INFO:tasks.workunit.client.0.vm01.stderr: Abort image migration: 100% complete...done.
2026-03-24T17:18:25.520 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-24T17:18:25.549 INFO:tasks.workunit.client.0.vm01.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-24T17:18:25.550 INFO:tasks.workunit.client.0.vm01.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-24T17:18:25.550 INFO:tasks.workunit.client.0.vm01.stdout:elapsed: 0 ops: 1 ops/sec: 250 bytes/sec: 250 KiB/s
2026-03-24T17:18:25.556 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test2
2026-03-24T17:18:25.620 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 3% complete... [...] Removing image: 96% complete...
2026-03-24T17:18:25.614+0000 7fd166cee640 0 -- 192.168.123.101:0/1239344042 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x5575cc75d360 msgr2=0x5575cc78ef10 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:18:25.623 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:18:25.627 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images
2026-03-24T17:18:25.627 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
[...]
2026-03-24T17:18:26.628 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd pool rm rbd2 rbd2 --yes-i-really-really-mean-it
2026-03-24T17:18:26.883 INFO:tasks.workunit.client.0.vm01.stderr:pool 'rbd2' does not exist
2026-03-24T17:18:26.895 INFO:tasks.workunit.client.0.vm01.stdout:testing config...
2026-03-24T17:18:26.895 INFO:tasks.workunit.client.0.vm01.stderr:+ test_config
2026-03-24T17:18:26.895 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'testing config...'
2026-03-24T17:18:26.895 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images
2026-03-24T17:18:26.895 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
[...]
2026-03-24T17:18:28.339 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd config global set osd rbd_cache true
2026-03-24T17:18:28.339 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config global set osd rbd_cache true
2026-03-24T17:18:28.353 INFO:tasks.workunit.client.0.vm01.stderr:rbd: invalid config entity: osd (must be global, client or client.<id>)
2026-03-24T17:18:28.354 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:18:28.354 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd config global set global debug_ms 10
2026-03-24T17:18:28.354 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config global set global debug_ms 10
2026-03-24T17:18:28.368 INFO:tasks.workunit.client.0.vm01.stderr:rbd: not rbd option: debug_ms
2026-03-24T17:18:28.369 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:18:28.369 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd config global set global rbd_UNKNOWN false
2026-03-24T17:18:28.369 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config global set global rbd_UNKNOWN false
2026-03-24T17:18:28.383 INFO:tasks.workunit.client.0.vm01.stderr:rbd: invalid config key: rbd_UNKNOWN
2026-03-24T17:18:28.384 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:18:28.384 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd config global set global rbd_cache INVALID
2026-03-24T17:18:28.384 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config global set global rbd_cache INVALID
2026-03-24T17:18:28.404 INFO:tasks.workunit.client.0.vm01.stderr:rbd: error setting rbd_cache: error parsing value: Expected option value to be integer, got 'INVALID'
2026-03-24T17:18:28.406 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:18:28.406 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config global set global rbd_cache false
2026-03-24T17:18:28.435 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config global set client rbd_cache true
2026-03-24T17:18:28.459 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config global set client.123 rbd_cache false
2026-03-24T17:18:28.484 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^false$'
2026-03-24T17:18:28.484 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config global get global rbd_cache
2026-03-24T17:18:28.507 INFO:tasks.workunit.client.0.vm01.stdout:false
2026-03-24T17:18:28.507 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config global get client rbd_cache
2026-03-24T17:18:28.507 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^true$'
2026-03-24T17:18:28.530 INFO:tasks.workunit.client.0.vm01.stdout:true
2026-03-24T17:18:28.530 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config global get client.123 rbd_cache
2026-03-24T17:18:28.530 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^false$'
2026-03-24T17:18:28.553 INFO:tasks.workunit.client.0.vm01.stdout:false
2026-03-24T17:18:28.553 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd config global get client.UNKNOWN rbd_cache
2026-03-24T17:18:28.553 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config global get client.UNKNOWN rbd_cache
2026-03-24T17:18:28.573 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd_cache is not set
2026-03-24T17:18:28.575 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:18:28.575 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config global list global
2026-03-24T17:18:28.575 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^rbd_cache * false * global *$'
2026-03-24T17:18:28.598 INFO:tasks.workunit.client.0.vm01.stdout:rbd_cache false global
2026-03-24T17:18:28.598 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config global list client
2026-03-24T17:18:28.598 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^rbd_cache * true * client *$'
2026-03-24T17:18:28.621 INFO:tasks.workunit.client.0.vm01.stdout:rbd_cache true client 2026-03-24T17:18:28.621 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config global list client.123 2026-03-24T17:18:28.621 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^rbd_cache * false * client.123 *$' 2026-03-24T17:18:28.644 INFO:tasks.workunit.client.0.vm01.stdout:rbd_cache false client.123 2026-03-24T17:18:28.644 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config global list client.UNKNOWN 2026-03-24T17:18:28.645 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^rbd_cache * true * client *$' 2026-03-24T17:18:28.667 INFO:tasks.workunit.client.0.vm01.stdout:rbd_cache true client 2026-03-24T17:18:28.667 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config global rm client rbd_cache 2026-03-24T17:18:28.693 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd config global get client rbd_cache 2026-03-24T17:18:28.693 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config global get client rbd_cache 2026-03-24T17:18:28.713 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd_cache is not set 2026-03-24T17:18:28.716 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-24T17:18:28.716 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config global list client 2026-03-24T17:18:28.716 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^rbd_cache * false * global *$' 2026-03-24T17:18:28.738 INFO:tasks.workunit.client.0.vm01.stdout:rbd_cache false global 2026-03-24T17:18:28.738 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config global rm client.123 rbd_cache 2026-03-24T17:18:28.763 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config global rm global rbd_cache 2026-03-24T17:18:28.789 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config pool set rbd rbd_cache true 2026-03-24T17:18:28.821 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config pool list rbd 2026-03-24T17:18:28.821 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^rbd_cache * true * pool *$' 
2026-03-24T17:18:28.843 INFO:tasks.workunit.client.0.vm01.stdout:rbd_cache true pool
2026-03-24T17:18:28.843 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config pool get rbd rbd_cache
2026-03-24T17:18:28.843 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^true$'
2026-03-24T17:18:28.865 INFO:tasks.workunit.client.0.vm01.stdout:true
2026-03-24T17:18:28.865 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create -s 1 test1
2026-03-24T17:18:28.884 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:18:28.878+0000 7fa14bef1200 -1 librbd: Forced V1 image creation.
2026-03-24T17:18:28.892 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config image list rbd/test1
2026-03-24T17:18:28.892 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^rbd_cache * true * pool *$'
2026-03-24T17:18:28.920 INFO:tasks.workunit.client.0.vm01.stdout:rbd_cache true pool
2026-03-24T17:18:28.920 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config image set rbd/test1 rbd_cache false
2026-03-24T17:18:28.951 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config image list rbd/test1
2026-03-24T17:18:28.951 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^rbd_cache * false * image *$'
2026-03-24T17:18:28.978 INFO:tasks.workunit.client.0.vm01.stdout:rbd_cache false image
2026-03-24T17:18:28.978 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config image get rbd/test1 rbd_cache
2026-03-24T17:18:28.978 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^false$'
2026-03-24T17:18:29.006 INFO:tasks.workunit.client.0.vm01.stdout:false
2026-03-24T17:18:29.006 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config image remove rbd/test1 rbd_cache
2026-03-24T17:18:29.036 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd config image get rbd/test1 rbd_cache
2026-03-24T17:18:29.036 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config image get rbd/test1 rbd_cache
2026-03-24T17:18:29.060 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd_cache is not set
2026-03-24T17:18:29.064 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:18:29.064 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config image list rbd/test1
2026-03-24T17:18:29.064 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^rbd_cache * true * pool *$'
2026-03-24T17:18:29.091 INFO:tasks.workunit.client.0.vm01.stdout:rbd_cache true pool
2026-03-24T17:18:29.091 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config pool remove rbd rbd_cache
2026-03-24T17:18:29.120 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd config pool get rbd rbd_cache
2026-03-24T17:18:29.120 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config pool get rbd rbd_cache
2026-03-24T17:18:29.140 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd_cache is not set
2026-03-24T17:18:29.142 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:18:29.142 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd config pool list rbd
2026-03-24T17:18:29.142 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^rbd_cache * true * config *$'
2026-03-24T17:18:29.164 INFO:tasks.workunit.client.0.vm01.stdout:rbd_cache true config
2026-03-24T17:18:29.165 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test1
2026-03-24T17:18:29.194 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:18:29.197 INFO:tasks.workunit.client.0.vm01.stdout:testing import, export, resize, and snapshots...
2026-03-24T17:18:29.197 INFO:tasks.workunit.client.0.vm01.stderr:+ RBD_CREATE_ARGS=
2026-03-24T17:18:29.197 INFO:tasks.workunit.client.0.vm01.stderr:+ test_others
2026-03-24T17:18:29.197 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'testing import, export, resize, and snapshots...'
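The trace above exercises the three `rbd config` scopes (global, pool, image) and shows an unset per-client override falling back to the global value. A minimal sketch of the same hierarchy, assuming a reachable Ceph cluster with a pool named `rbd`; the script skips quietly when the `rbd` CLI is not installed:

```shell
#!/bin/sh
# Sketch of the rbd config override hierarchy exercised in the log above.
# Assumes a running Ceph cluster and pool "rbd"; skips when rbd is absent.
set -e
STATUS=ok
if command -v rbd >/dev/null 2>&1; then
    # Global-level default, overridden per client type, then per client id.
    rbd config global set global     rbd_cache false
    rbd config global set client     rbd_cache true
    rbd config global set client.123 rbd_cache false
    # Each scope reports its own effective value.
    rbd config global get global     rbd_cache | grep -qx false
    rbd config global get client     rbd_cache | grep -qx true
    rbd config global get client.123 rbd_cache | grep -qx false
    # Removing the client override falls back to the global value,
    # and the listing labels the source scope accordingly.
    rbd config global rm client rbd_cache
    rbd config global list client | grep -q 'rbd_cache.*false.*global'
    # Clean up the remaining overrides.
    rbd config global rm client.123 rbd_cache
    rbd config global rm global     rbd_cache
else
    STATUS=skipped
fi
echo "$STATUS"
```

The `config pool` and `config image` subcommands seen later in the log layer two further scopes below `global`, with the same set/get/list/remove verbs.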
2026-03-24T17:18:29.197 INFO:tasks.workunit.client.0.vm01.stderr:+ TMP_FILES='/tmp/img1 /tmp/img1.new /tmp/img2 /tmp/img2.new /tmp/img3 /tmp/img3.new /tmp/img-diff1.new /tmp/img-diff2.new /tmp/img-diff3.new /tmp/img1.snap1 /tmp/img1.snap1 /tmp/img-diff1.snap1'
2026-03-24T17:18:29.197 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images
2026-03-24T17:18:29.197 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:29.253 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:29.308 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:29.362 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:29.419 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:29.478 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:29.537 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:29.594 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:29.652 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:29.711 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:29.768 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:29.828 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:29.887 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:29.943 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:30.001 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:30.057 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:30.113 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:30.170 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:30.226 INFO:tasks.workunit.client.0.vm01.stderr:+ rm -f /tmp/img1 /tmp/img1.new /tmp/img2 /tmp/img2.new /tmp/img3 /tmp/img3.new /tmp/img-diff1.new /tmp/img-diff2.new /tmp/img-diff3.new /tmp/img1.snap1 /tmp/img1.snap1 /tmp/img-diff1.snap1
2026-03-24T17:18:30.227 INFO:tasks.workunit.client.0.vm01.stderr:+ dd if=/bin/sh of=/tmp/img1 bs=1k count=1 seek=10
2026-03-24T17:18:30.227 INFO:tasks.workunit.client.0.vm01.stderr:1+0 records in
2026-03-24T17:18:30.227 INFO:tasks.workunit.client.0.vm01.stderr:1+0 records out
2026-03-24T17:18:30.227 INFO:tasks.workunit.client.0.vm01.stderr:1024 bytes (1.0 kB, 1.0 KiB) copied, 3.2541e-05 s, 31.5 MB/s
2026-03-24T17:18:30.227 INFO:tasks.workunit.client.0.vm01.stderr:+ dd if=/bin/dd of=/tmp/img1 bs=1k count=10 seek=100
2026-03-24T17:18:30.228 INFO:tasks.workunit.client.0.vm01.stderr:10+0 records in
2026-03-24T17:18:30.228 INFO:tasks.workunit.client.0.vm01.stderr:10+0 records out
2026-03-24T17:18:30.228 INFO:tasks.workunit.client.0.vm01.stderr:10240 bytes (10 kB, 10 KiB) copied, 4.7268e-05 s, 217 MB/s
2026-03-24T17:18:30.228 INFO:tasks.workunit.client.0.vm01.stderr:+ dd if=/bin/rm of=/tmp/img1 bs=1k count=100 seek=1000
2026-03-24T17:18:30.228 INFO:tasks.workunit.client.0.vm01.stderr:58+1 records in
2026-03-24T17:18:30.228 INFO:tasks.workunit.client.0.vm01.stderr:58+1 records out
2026-03-24T17:18:30.228 INFO:tasks.workunit.client.0.vm01.stderr:59912 bytes (60 kB, 59 KiB) copied, 0.000150782 s, 397 MB/s
2026-03-24T17:18:30.229 INFO:tasks.workunit.client.0.vm01.stderr:+ dd if=/bin/ls of=/tmp/img1 bs=1k seek=10000
2026-03-24T17:18:30.229 INFO:tasks.workunit.client.0.vm01.stderr:134+1 records in
2026-03-24T17:18:30.229 INFO:tasks.workunit.client.0.vm01.stderr:134+1 records out
2026-03-24T17:18:30.229 INFO:tasks.workunit.client.0.vm01.stderr:138216 bytes (138 kB, 135 KiB) copied, 0.000371135 s, 372 MB/s
2026-03-24T17:18:30.230 INFO:tasks.workunit.client.0.vm01.stderr:+ dd if=/bin/ln of=/tmp/img1 bs=1k seek=100000
2026-03-24T17:18:30.230 INFO:tasks.workunit.client.0.vm01.stderr:58+1 records in
2026-03-24T17:18:30.230 INFO:tasks.workunit.client.0.vm01.stderr:58+1 records out
2026-03-24T17:18:30.230 INFO:tasks.workunit.client.0.vm01.stderr:59912 bytes (60 kB, 59 KiB) copied, 0.00015513 s, 386 MB/s
2026-03-24T17:18:30.231 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd import /tmp/img1 testimg1
2026-03-24T17:18:30.250 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:18:30.246+0000 7f381b8c6200 -1 librbd: Forced V1 image creation.
2026-03-24T17:18:30.365 INFO:tasks.workunit.client.0.vm01.stderr: Importing image: 100% complete...done.
2026-03-24T17:18:30.369 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd resize testimg1 --size=256 --allow-shrink
2026-03-24T17:18:30.394 INFO:tasks.workunit.client.0.vm01.stderr: Resizing image: 100% complete...done.
2026-03-24T17:18:30.398 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd export testimg1 /tmp/img2
2026-03-24T17:18:30.470 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 100% complete...done.
2026-03-24T17:18:30.474 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create testimg1 --snap=snap1
2026-03-24T17:18:31.438 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 100% complete...done.
2026-03-24T17:18:31.443 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd resize testimg1 --size=128
2026-03-24T17:18:31.466 INFO:tasks.workunit.client.0.vm01.stderr:rbd: shrinking an image is only allowed with the --allow-shrink flag
2026-03-24T17:18:31.471 INFO:tasks.workunit.client.0.vm01.stderr:+ true
2026-03-24T17:18:31.471 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd resize testimg1 --size=128 --allow-shrink
2026-03-24T17:18:31.691 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:18:31.690+0000 7f59f2bb4640 0 --2- 192.168.123.101:0/2753261729 >> [v2:192.168.123.101:3300/0,v1:192.168.123.101:6789/0] conn(0x5654563e6cf0 0x5654564a7090 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
2026-03-24T17:18:31.713 INFO:tasks.workunit.client.0.vm01.stderr: Resizing image: 100% complete...done.
2026-03-24T17:18:31.718 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd export testimg1 /tmp/img3
2026-03-24T17:18:31.772 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 100% complete...done.
2026-03-24T17:18:31.776 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info testimg1
2026-03-24T17:18:31.776 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'size 128 MiB'
2026-03-24T17:18:31.801 INFO:tasks.workunit.client.0.vm01.stdout: size 128 MiB in 32 objects
2026-03-24T17:18:31.801 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info --snap=snap1 testimg1
2026-03-24T17:18:31.801 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'size 256 MiB'
2026-03-24T17:18:31.825 INFO:tasks.workunit.client.0.vm01.stdout: size 256 MiB in 64 objects
2026-03-24T17:18:31.825 INFO:tasks.workunit.client.0.vm01.stderr:+ rm -rf /tmp/diff-testimg1-1 /tmp/diff-testimg1-2
2026-03-24T17:18:31.826 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd export-diff testimg1 --snap=snap1 /tmp/diff-testimg1-1
2026-03-24T17:18:31.860 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 100% complete...done.
2026-03-24T17:18:31.864 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd export-diff testimg1 --from-snap=snap1 /tmp/diff-testimg1-2
2026-03-24T17:18:31.889 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 100% complete...done.
2026-03-24T17:18:31.892 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --size=1 testimg-diff1
2026-03-24T17:18:31.911 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:18:31.906+0000 7f5482aa1200 -1 librbd: Forced V1 image creation.
2026-03-24T17:18:31.917 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd import-diff --sparse-size 8K /tmp/diff-testimg1-1 testimg-diff1
2026-03-24T17:18:32.835 INFO:tasks.workunit.client.0.vm01.stderr: Importing image diff: 100% complete...done.
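The sequence above is the standard incremental-replication pattern: a full `export-diff` up to a snapshot plus an `export-diff --from-snap` delta reconstruct the source image when applied with `import-diff` to a fresh target (the base diff also recreates the snapshot, which the incremental diff anchors to). A minimal sketch with a hypothetical image name `src`, assuming a reachable cluster; it skips quietly when the `rbd` CLI is absent:

```shell
#!/bin/sh
# Sketch of the export-diff / import-diff round trip shown in the log above.
# Image names "src" and "dst" are hypothetical; requires a running cluster.
set -e
STATUS=ok
if command -v rbd >/dev/null 2>&1; then
    rbd create --size 256 src
    rbd snap create src --snap snap1
    rbd export-diff src --snap snap1      /tmp/src.diff.base   # start -> snap1
    rbd export-diff src --from-snap snap1 /tmp/src.diff.incr   # snap1 -> head
    rbd create --size 1 dst                       # size is corrected by the diffs
    rbd import-diff /tmp/src.diff.base dst        # also recreates snap1 on dst
    rbd import-diff /tmp/src.diff.incr dst
    rbd info dst | grep -q 'size 256 MiB'         # dst now matches src
else
    STATUS=skipped
fi
echo "$STATUS"
```

The `--sparse-size` option used in the log additionally controls the granularity at which zeroed extents are detected and written sparsely on import.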
2026-03-24T17:18:32.840 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd import-diff --sparse-size 8K /tmp/diff-testimg1-2 testimg-diff1
2026-03-24T17:18:32.878 INFO:tasks.workunit.client.0.vm01.stderr: Importing image diff: 100% complete...done.
2026-03-24T17:18:32.882 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info testimg1
2026-03-24T17:18:32.882 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'size 128 MiB'
2026-03-24T17:18:32.906 INFO:tasks.workunit.client.0.vm01.stdout: size 128 MiB in 32 objects
2026-03-24T17:18:32.907 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info --snap=snap1 testimg1
2026-03-24T17:18:32.907 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'size 256 MiB'
2026-03-24T17:18:32.931 INFO:tasks.workunit.client.0.vm01.stdout: size 256 MiB in 64 objects
2026-03-24T17:18:32.931 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info testimg-diff1
2026-03-24T17:18:32.931 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'size 128 MiB'
2026-03-24T17:18:32.955 INFO:tasks.workunit.client.0.vm01.stdout: size 128 MiB in 32 objects
2026-03-24T17:18:32.956 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info --snap=snap1 testimg-diff1
2026-03-24T17:18:32.956 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'size 256 MiB'
2026-03-24T17:18:32.979 INFO:tasks.workunit.client.0.vm01.stdout: size 256 MiB in 64 objects
2026-03-24T17:18:32.980 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd copy testimg1 --snap=snap1 testimg2
2026-03-24T17:18:33.041 INFO:tasks.workunit.client.0.vm01.stderr: Image copy: 100% complete...done.
2026-03-24T17:18:33.046 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd copy testimg1 testimg3
2026-03-24T17:18:33.101 INFO:tasks.workunit.client.0.vm01.stderr: Image copy: 100% complete...done.
2026-03-24T17:18:33.106 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd copy testimg-diff1 --sparse-size 768K --snap=snap1 testimg-diff2
2026-03-24T17:18:33.180 INFO:tasks.workunit.client.0.vm01.stderr: Image copy: 100% complete...done.
2026-03-24T17:18:33.185 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd copy testimg-diff1 --sparse-size 768K testimg-diff3
2026-03-24T17:18:33.405 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:18:33.402+0000 7f0026aa1640 0 --2- 192.168.123.101:0/1466011013 >> [v2:192.168.123.101:3300/0,v1:192.168.123.101:6789/0] conn(0x7f00140030d0 0x7f00140034c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
2026-03-24T17:18:33.453 INFO:tasks.workunit.client.0.vm01.stderr: Image copy: 100% complete...done.
2026-03-24T17:18:33.458 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info testimg2
2026-03-24T17:18:33.459 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'size 256 MiB'
2026-03-24T17:18:33.484 INFO:tasks.workunit.client.0.vm01.stdout: size 256 MiB in 64 objects
2026-03-24T17:18:33.485 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info testimg3
2026-03-24T17:18:33.485 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'size 128 MiB'
2026-03-24T17:18:33.510 INFO:tasks.workunit.client.0.vm01.stdout: size 128 MiB in 32 objects
2026-03-24T17:18:33.511 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info testimg-diff2
2026-03-24T17:18:33.511 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'size 256 MiB'
2026-03-24T17:18:33.537 INFO:tasks.workunit.client.0.vm01.stdout: size 256 MiB in 64 objects
2026-03-24T17:18:33.537 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info testimg-diff3
2026-03-24T17:18:33.537 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'size 128 MiB'
2026-03-24T17:18:33.563 INFO:tasks.workunit.client.0.vm01.stdout: size 128 MiB in 32 objects
2026-03-24T17:18:33.563 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd deep copy testimg1 testimg4
2026-03-24T17:18:33.854 INFO:tasks.workunit.client.0.vm01.stderr: Image deep copy: 100% complete...done.
2026-03-24T17:18:33.858 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd deep copy testimg1 --snap=snap1 testimg5
2026-03-24T17:18:34.912 INFO:tasks.workunit.client.0.vm01.stderr: Image deep copy: 100% complete...done.
2026-03-24T17:18:34.916 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info testimg4
2026-03-24T17:18:34.916 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'size 128 MiB'
2026-03-24T17:18:34.942 INFO:tasks.workunit.client.0.vm01.stdout: size 128 MiB in 32 objects
2026-03-24T17:18:34.943 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info testimg5
2026-03-24T17:18:34.943 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'size 256 MiB'
2026-03-24T17:18:35.169 INFO:tasks.workunit.client.0.vm01.stdout: size 256 MiB in 64 objects
2026-03-24T17:18:35.170 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap ls testimg4
2026-03-24T17:18:35.170 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -v SNAPID
2026-03-24T17:18:35.170 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:18:35.170 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 1
2026-03-24T17:18:35.196 INFO:tasks.workunit.client.0.vm01.stdout:1
2026-03-24T17:18:35.196 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap ls testimg4
2026-03-24T17:18:35.196 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '.*snap1.*'
2026-03-24T17:18:35.222 INFO:tasks.workunit.client.0.vm01.stdout: 11 snap1 256 MiB Tue Mar 24 17:18:33 2026
2026-03-24T17:18:35.222 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd export testimg1 /tmp/img1.new
2026-03-24T17:18:35.298 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 100% complete...done.
2026-03-24T17:18:35.302 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd export testimg2 /tmp/img2.new
2026-03-24T17:18:35.381 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 100% complete...done.
2026-03-24T17:18:35.386 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd export testimg3 /tmp/img3.new
2026-03-24T17:18:35.430 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 100% complete...done.
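The exports above are validated further down by byte-for-byte comparison with `cmp`. A minimal local sketch of that round-trip check (the `rbd export` half is shown as comments, since it assumes a reachable cluster; the file names below are examples, not from the job):

```shell
# Export-then-compare pattern used by rbd/cli_generic.sh (sketch).
# With a cluster available, the export half would be:
#   rbd export testimg1 /tmp/img1.new
#   cmp /tmp/img1 /tmp/img1.new    # non-zero exit on any byte difference
# The cmp verification itself needs no cluster:
printf 'payload' > /tmp/cli_generic_a
printf 'payload' > /tmp/cli_generic_b
cmp /tmp/cli_generic_a /tmp/cli_generic_b && echo identical
rm -f /tmp/cli_generic_a /tmp/cli_generic_b
```

`cmp` is silent on success, so the test relies purely on its exit status under `set -e`.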
2026-03-24T17:18:35.435 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd export testimg-diff1 /tmp/img-diff1.new
2026-03-24T17:18:35.488 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 100% complete...done.
2026-03-24T17:18:35.492 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd export testimg-diff2 /tmp/img-diff2.new
2026-03-24T17:18:35.573 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 100% complete...done.
2026-03-24T17:18:35.577 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd export testimg-diff3 /tmp/img-diff3.new
2026-03-24T17:18:35.630 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 100% complete...done.
2026-03-24T17:18:35.635 INFO:tasks.workunit.client.0.vm01.stderr:+ cmp /tmp/img2 /tmp/img2.new
2026-03-24T17:18:36.143 INFO:tasks.workunit.client.0.vm01.stderr:+ cmp /tmp/img3 /tmp/img3.new
2026-03-24T17:18:36.431 INFO:tasks.workunit.client.0.vm01.stderr:+ cmp /tmp/img2 /tmp/img-diff2.new
2026-03-24T17:18:36.786 INFO:tasks.workunit.client.0.vm01.stderr:+ cmp /tmp/img3 /tmp/img-diff3.new
2026-03-24T17:18:36.945 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap rollback --snap=snap1 testimg1
2026-03-24T17:18:37.001 INFO:tasks.workunit.client.0.vm01.stderr: Rolling back to snapshot: 100% complete...done.
2026-03-24T17:18:37.005 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap rollback --snap=snap1 testimg-diff1
2026-03-24T17:18:37.080 INFO:tasks.workunit.client.0.vm01.stderr: Rolling back to snapshot: 100% complete...done.
2026-03-24T17:18:37.084 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info testimg1
2026-03-24T17:18:37.084 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'size 256 MiB'
2026-03-24T17:18:37.108 INFO:tasks.workunit.client.0.vm01.stdout: size 256 MiB in 64 objects
2026-03-24T17:18:37.108 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info testimg-diff1
2026-03-24T17:18:37.108 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'size 256 MiB'
2026-03-24T17:18:37.132 INFO:tasks.workunit.client.0.vm01.stdout: size 256 MiB in 64 objects
2026-03-24T17:18:37.132 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd export testimg1 /tmp/img1.snap1
2026-03-24T17:18:37.245 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 100% complete...done.
2026-03-24T17:18:37.249 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd export testimg-diff1 /tmp/img-diff1.snap1
2026-03-24T17:18:37.326 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 100% complete...done.
2026-03-24T17:18:37.331 INFO:tasks.workunit.client.0.vm01.stderr:+ cmp /tmp/img2 /tmp/img1.snap1
2026-03-24T17:18:37.621 INFO:tasks.workunit.client.0.vm01.stderr:+ cmp /tmp/img2 /tmp/img-diff1.snap1
2026-03-24T17:18:37.930 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm testimg2
2026-03-24T17:18:37.995 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 87% complete...2026-03-24T17:18:37.990+0000 7f6dd38d1640 0 -- 192.168.123.101:0/3231166067 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x555e29a47320 msgr2=0x555e29a7c8a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:18:38.004 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 98% complete...2026-03-24T17:18:37.998+0000 7f6dd38d1640 0 -- 192.168.123.101:0/3231166067 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7f6db005bcc0 msgr2=0x7f6db007c0a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:18:38.007 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:18:38.010 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm testimg3
2026-03-24T17:18:38.075 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:18:38.078 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create testimg2 -s 0
2026-03-24T17:18:38.098 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:18:38.094+0000 7f1ad18a8200 -1 librbd: Forced V1 image creation.
2026-03-24T17:18:38.105 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd cp testimg2 testimg3
2026-03-24T17:18:38.145 INFO:tasks.workunit.client.0.vm01.stderr: Image copy: 100% complete...done.
2026-03-24T17:18:38.149 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd deep cp testimg2 testimg6
2026-03-24T17:18:38.188 INFO:tasks.workunit.client.0.vm01.stderr: Image deep copy: 100% complete...done.
2026-03-24T17:18:38.192 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap rm --snap=snap1 testimg1
2026-03-24T17:18:39.052 INFO:tasks.workunit.client.0.vm01.stderr: Removing snap: 100% complete...done.
2026-03-24T17:18:39.057 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap rm --snap=snap1 testimg-diff1
2026-03-24T17:18:40.057 INFO:tasks.workunit.client.0.vm01.stderr: Removing snap: 100% complete...done.
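The `injecting socket failure` messages interleaved with the `rbd rm` progress above come from this job's `ms inject socket failures: 5000` override; they are expected messenger-level noise, not errors. A sketch for tallying them when triaging a log, run here against sample lines abridged from the output above (the excerpt file name is made up for the example):

```shell
# Count messenger-level injected socket failures in a log excerpt.
log=/tmp/teuthology_excerpt
cat > "$log" <<'EOF'
conn(0x555e29a47320 msgr2=0x555e29a7c8a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
Removing image: 89% complete...
conn(0x7f6db005bcc0 msgr2=0x7f6db007c0a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
EOF
grep -c 'injecting socket failure' "$log"
rm -f "$log"
```

A count roughly proportional to traffic is normal for this msgr-failures/few configuration; what would be suspect is a command that aborts rather than retrying after an injection.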
2026-03-24T17:18:40.062 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info --snap=snap1 testimg1
2026-03-24T17:18:40.062 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'error setting snapshot context: (2) No such file or directory'
2026-03-24T17:18:40.086 INFO:tasks.workunit.client.0.vm01.stdout:error setting snapshot context: (2) No such file or directory
2026-03-24T17:18:40.086 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info --snap=snap1 testimg-diff1
2026-03-24T17:18:40.086 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'error setting snapshot context: (2) No such file or directory'
2026-03-24T17:18:40.112 INFO:tasks.workunit.client.0.vm01.stdout:error setting snapshot context: (2) No such file or directory
2026-03-24T17:18:40.112 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd sparsify testimg1
2026-03-24T17:18:40.167 INFO:tasks.workunit.client.0.vm01.stderr: Image sparsify: 100% complete...done.
2026-03-24T17:18:40.171 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images
2026-03-24T17:18:40.171 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:40.253 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:40.311 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:40.396 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:41.137 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:42.150 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:42.238 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:42.318 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:42.422 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:42.523 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:42.585 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:42.646 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:42.708 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:42.769 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:42.830 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:42.890 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:42.948 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:43.010 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:43.071 INFO:tasks.workunit.client.0.vm01.stderr:+ rm -f /tmp/img1 /tmp/img1.new /tmp/img2 /tmp/img2.new /tmp/img3 /tmp/img3.new /tmp/img-diff1.new /tmp/img-diff2.new /tmp/img-diff3.new /tmp/img1.snap1 /tmp/img1.snap1 /tmp/img-diff1.snap1
2026-03-24T17:18:43.189 INFO:tasks.workunit.client.0.vm01.stdout:testing locking...
2026-03-24T17:18:43.189 INFO:tasks.workunit.client.0.vm01.stderr:+ test_locking
2026-03-24T17:18:43.189 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'testing locking...'
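After `rbd snap rm`, the workunit asserts the snapshot is really gone by grepping `rbd info --snap` output for the exact ENOENT error text. The same assertion style can be exercised against the canned message, with no cluster required (the `expected-error-ok` marker is just for this sketch):

```shell
# Assert an expected-failure message the way the workunit does: grep
# the CLI's ENOENT text and succeed only if it is present.
msg='error setting snapshot context: (2) No such file or directory'
printf '%s\n' "$msg" | grep -q 'No such file or directory' && echo expected-error-ok
```

Matching on the error string rather than the exit code pins down *which* failure occurred, so a different error (e.g. a permission problem) still fails the test.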
2026-03-24T17:18:43.189 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images
2026-03-24T17:18:43.189 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:43.252 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:43.312 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:43.373 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:43.435 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:43.496 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:43.759 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:43.822 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:43.883 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:43.947 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:44.012 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:44.080 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:44.145 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:44.211 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:44.275 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:44.337 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:44.399 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:44.460 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:18:44.524 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create -s 1 test1
2026-03-24T17:18:44.547 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:18:44.542+0000 7f92b5ec2200 -1 librbd: Forced V1 image creation.
2026-03-24T17:18:44.556 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd lock list test1
2026-03-24T17:18:44.556 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:18:44.556 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^0$'
2026-03-24T17:18:44.581 INFO:tasks.workunit.client.0.vm01.stdout:0
2026-03-24T17:18:44.581 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd lock add test1 id
2026-03-24T17:18:44.613 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd lock list test1
2026-03-24T17:18:44.613 INFO:tasks.workunit.client.0.vm01.stderr:+ grep ' 1 '
2026-03-24T17:18:44.639 INFO:tasks.workunit.client.0.vm01.stdout:There is 1 exclusive lock on this image.
2026-03-24T17:18:44.639 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd lock list test1
2026-03-24T17:18:44.639 INFO:tasks.workunit.client.0.vm01.stderr:++ tail -n 1
2026-03-24T17:18:44.639 INFO:tasks.workunit.client.0.vm01.stderr:++ awk '{print $1;}'
2026-03-24T17:18:44.664 INFO:tasks.workunit.client.0.vm01.stderr:+ LOCKER=client.7724
2026-03-24T17:18:44.664 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd lock remove test1 id client.7724
2026-03-24T17:18:45.079 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd lock list test1
2026-03-24T17:18:45.079 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:18:45.079 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^0$'
2026-03-24T17:18:45.105 INFO:tasks.workunit.client.0.vm01.stdout:0
2026-03-24T17:18:45.105 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd lock add test1 id --shared tag
2026-03-24T17:18:45.137 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd lock list test1
2026-03-24T17:18:45.137 INFO:tasks.workunit.client.0.vm01.stderr:+ grep ' 1 '
2026-03-24T17:18:45.162 INFO:tasks.workunit.client.0.vm01.stdout:There is 1 shared lock on this image.
2026-03-24T17:18:45.163 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd lock add test1 id --shared tag
2026-03-24T17:18:45.197 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd lock list test1
2026-03-24T17:18:45.197 INFO:tasks.workunit.client.0.vm01.stderr:+ grep ' 2 '
2026-03-24T17:18:45.223 INFO:tasks.workunit.client.0.vm01.stdout:There are 2 shared locks on this image.
2026-03-24T17:18:45.223 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd lock add test1 id2 --shared tag
2026-03-24T17:18:45.254 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd lock list test1
2026-03-24T17:18:45.254 INFO:tasks.workunit.client.0.vm01.stderr:+ grep ' 3 '
2026-03-24T17:18:45.279 INFO:tasks.workunit.client.0.vm01.stdout:There are 3 shared locks on this image.
2026-03-24T17:18:45.280 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd lock list test1
2026-03-24T17:18:45.280 INFO:tasks.workunit.client.0.vm01.stderr:+ tail -n 1
2026-03-24T17:18:45.280 INFO:tasks.workunit.client.0.vm01.stderr:+ awk '{print $2, $1;}'
2026-03-24T17:18:45.280 INFO:tasks.workunit.client.0.vm01.stderr:+ xargs rbd lock remove test1
2026-03-24T17:18:46.083 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info test1
2026-03-24T17:18:46.083 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -qE 'features:.*exclusive'
2026-03-24T17:18:46.108 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test1
2026-03-24T17:18:46.139 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:18:46.143 INFO:tasks.workunit.client.0.vm01.stdout:testing thick provision...
2026-03-24T17:18:46.143 INFO:tasks.workunit.client.0.vm01.stderr:+ test_thick_provision
2026-03-24T17:18:46.143 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'testing thick provision...'
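The locking test above scrapes the locker ID out of `rbd lock list` with `tail -n 1 | awk '{print $1;}'` and feeds it back into `rbd lock remove`. That parsing step can be reproduced offline against the output captured in the log; in the sketch below the table header and address column are assumptions, while the `client.7724` locker ID comes straight from the log.

```shell
# Reproduce the locker-ID scrape from the xtrace above, using a
# hard-coded 'rbd lock list' sample. The header and address columns
# are assumed for illustration; client.7724 appears in the log.
sample_lock_list() {
    printf '%s\n' \
        'There is 1 exclusive lock on this image.' \
        'Locker       ID   Address' \
        'client.7724  id   192.168.123.101:0/1675296411'
}

# Same pipeline as the test: last line of the listing, first column.
LOCKER=$(sample_lock_list | tail -n 1 | awk '{print $1;}')
echo "rbd lock remove test1 id ${LOCKER}"
```

The `awk '{print $2, $1;}' | xargs rbd lock remove` variant later in the trace is the same idea with the ID and locker swapped into the argument order `rbd lock remove <image> <id> <locker>`.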
2026-03-24T17:18:46.143 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images 2026-03-24T17:18:46.143 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:18:46.205 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:18:46.268 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:18:46.333 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:18:46.397 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:18:46.459 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:18:46.522 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:18:46.584 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:18:46.645 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:18:46.709 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:18:46.773 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:18:46.840 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:18:46.903 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:18:46.965 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:18:47.032 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:18:47.096 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:18:47.160 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:18:47.225 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:18:47.287 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --thick-provision -s 64M test1 2026-03-24T17:18:47.309 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:18:47.302+0000 7f70e8ae1200 -1 librbd: Forced V1 image creation. 2026-03-24T17:18:47.745 INFO:tasks.workunit.client.0.vm01.stderr: Thick provisioning: 6% complete... Thick provisioning: 12% complete... 
Thick provisioning: 18% complete... Thick provisioning: 25% complete... Thick provisioning: 31% complete... Thick provisioning: 37% complete... Thick provisioning: 43% complete... Thick provisioning: 50% complete... Thick provisioning: 56% complete... Thick provisioning: 62% complete... Thick provisioning: 68% complete... Thick provisioning: 75% complete... Thick provisioning: 81% complete... Thick provisioning: 87% complete... Thick provisioning: 93% complete... Thick provisioning: 100% complete... Thick provisioning: 100% complete...done. 2026-03-24T17:18:47.750 INFO:tasks.workunit.client.0.vm01.stderr:+ count=0 2026-03-24T17:18:47.750 INFO:tasks.workunit.client.0.vm01.stderr:+ ret= 2026-03-24T17:18:47.750 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' 0 -lt 10 ']' 2026-03-24T17:18:47.751 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd du 2026-03-24T17:18:47.751 INFO:tasks.workunit.client.0.vm01.stderr:+ grep test1 2026-03-24T17:18:47.751 INFO:tasks.workunit.client.0.vm01.stderr:+ cut -d ' ' -f 4-5 2026-03-24T17:18:47.751 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^64 MiB' 2026-03-24T17:18:47.751 INFO:tasks.workunit.client.0.vm01.stderr:+ tr -s ' ' 2026-03-24T17:18:47.774 INFO:tasks.workunit.client.0.vm01.stderr:warning: fast-diff map is not enabled for test1. operation may be slow. 2026-03-24T17:18:47.780 INFO:tasks.workunit.client.0.vm01.stdout:64 MiB 2026-03-24T17:18:47.780 INFO:tasks.workunit.client.0.vm01.stderr:+ ret=0 2026-03-24T17:18:47.780 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' 0 = 0 ']' 2026-03-24T17:18:47.780 INFO:tasks.workunit.client.0.vm01.stderr:+ break 2026-03-24T17:18:47.780 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd du 2026-03-24T17:18:47.801 INFO:tasks.workunit.client.0.vm01.stderr:warning: fast-diff map is not enabled for test1. operation may be slow. 
2026-03-24T17:18:47.803 INFO:tasks.workunit.client.0.vm01.stdout:NAME PROVISIONED USED
2026-03-24T17:18:47.804 INFO:tasks.workunit.client.0.vm01.stdout:test1 64 MiB 64 MiB
2026-03-24T17:18:47.806 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' 0 '!=' 0 ']'
2026-03-24T17:18:47.806 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test1
2026-03-24T17:18:47.872 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 6% complete... Removing image: 12% complete... Removing image: 18% complete... Removing image: 25% complete... Removing image: 31% complete... Removing image: 37% complete... Removing image: 43% complete... Removing image: 50% complete... Removing image: 56% complete... Removing image: 62% complete... Removing image: 68% complete... Removing image: 75% complete... Removing image: 81% complete... Removing image: 87% complete... Removing image: 93% complete... Removing image: 100% complete...done.
2026-03-24T17:18:47.876 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls
2026-03-24T17:18:47.876 INFO:tasks.workunit.client.0.vm01.stderr:+ grep test1
2026-03-24T17:18:47.876 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:18:47.876 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^0$'
2026-03-24T17:18:47.899 INFO:tasks.workunit.client.0.vm01.stdout:0
2026-03-24T17:18:47.899 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --thick-provision -s 4G test1
2026-03-24T17:18:47.924 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:18:47.918+0000 7f7151be3200 -1 librbd: Forced V1 image creation.
2026-03-24T17:18:50.204 INFO:tasks.workunit.client.0.vm01.stderr: Thick provisioning: 1% complete... Thick provisioning: 2% complete... Thick provisioning: 3% complete... Thick provisioning: 4% complete... Thick provisioning: 5% complete... Thick provisioning: 6% complete... Thick provisioning: 7% complete... Thick provisioning: 8% complete... Thick provisioning: 9% complete...
Thick provisioning: 10% complete...2026-03-24T17:18:50.198+0000 7f7151940640 0 -- 192.168.123.101:0/1899999194 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55dd2bcc85d0 msgr2=0x55dd2bcadad0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:18:50.578 INFO:tasks.workunit.client.0.vm01.stderr: Thick provisioning: 11% complete... Thick provisioning: 12% complete...2026-03-24T17:18:50.574+0000 7f7151940640 0 -- 192.168.123.101:0/1899999194 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7f713005be10 msgr2=0x7f713007c210 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:19:12.396 INFO:tasks.workunit.client.0.vm01.stderr: Thick provisioning: 13% complete... Thick provisioning: 14% complete... Thick provisioning: 15% complete... Thick provisioning: 16% complete... Thick provisioning: 17% complete... Thick provisioning: 18% complete... Thick provisioning: 19% complete... Thick provisioning: 20% complete... Thick provisioning: 21% complete... Thick provisioning: 22% complete... Thick provisioning: 23% complete... Thick provisioning: 24% complete... Thick provisioning: 25% complete... Thick provisioning: 26% complete... Thick provisioning: 27% complete... Thick provisioning: 28% complete... Thick provisioning: 29% complete... Thick provisioning: 30% complete... Thick provisioning: 31% complete... Thick provisioning: 32% complete... Thick provisioning: 33% complete... Thick provisioning: 34% complete... Thick provisioning: 35% complete... Thick provisioning: 36% complete... Thick provisioning: 37% complete... Thick provisioning: 38% complete... Thick provisioning: 39% complete... Thick provisioning: 40% complete... Thick provisioning: 41% complete... Thick provisioning: 42% complete... Thick provisioning: 43% complete... Thick provisioning: 44% complete... Thick provisioning: 45% complete... 
Thick provisioning: 46% complete... Thick provisioning: 47% complete... Thick provisioning: 48% complete... Thick provisioning: 49% complete... Thick provisioning: 50% complete... Thick provisioning: 51% complete... Thick provisioning: 52% complete... Thick provisioning: 53% complete... Thick provisioning: 54% complete... Thick provisioning: 55% complete... Thick provisioning: 56% complete... Thick provisioning: 57% complete... Thick provisioning: 58% complete... Thick provisioning: 59% complete... Thick provisioning: 60% complete... Thick provisioning: 61% complete... Thick provisioning: 62% complete... Thick provisioning: 63% complete... Thick provisioning: 64% complete... Thick provisioning: 65% complete... Thick provisioning: 66% complete... Thick provisioning: 67% complete... Thick provisioning: 68% complete... Thick provisioning: 69% complete... Thick provisioning: 70% complete... Thick provisioning: 71% complete... Thick provisioning: 72% complete... Thick provisioning: 73% complete... Thick provisioning: 74% complete... Thick provisioning: 75% complete... Thick provisioning: 76% complete... Thick provisioning: 77% complete... Thick provisioning: 78% complete... Thick provisioning: 79% complete... Thick provisioning: 80% complete... Thick provisioning: 81% complete... Thick provisioning: 82% complete... Thick provisioning: 83% complete... Thick provisioning: 84% complete... Thick provisioning: 85% complete... Thick provisioning: 86% complete... Thick provisioning: 87% complete... Thick provisioning: 88% complete... Thick provisioning: 89% complete... Thick provisioning: 90% complete... Thick provisioning: 91% complete... Thick provisioning: 92% complete... Thick provisioning: 93% complete... Thick provisioning: 94% complete... Thick provisioning: 95% complete... Thick provisioning: 96% complete... Thick provisioning: 97% complete... Thick provisioning: 98% complete... Thick provisioning: 99% complete... Thick provisioning: 100% complete... 
Thick provisioning: 100% complete...done. 2026-03-24T17:19:12.403 INFO:tasks.workunit.client.0.vm01.stderr:+ count=0 2026-03-24T17:19:12.403 INFO:tasks.workunit.client.0.vm01.stderr:+ ret= 2026-03-24T17:19:12.403 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' 0 -lt 10 ']' 2026-03-24T17:19:12.403 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd du 2026-03-24T17:19:12.403 INFO:tasks.workunit.client.0.vm01.stderr:+ grep test1 2026-03-24T17:19:12.403 INFO:tasks.workunit.client.0.vm01.stderr:+ tr -s ' ' 2026-03-24T17:19:12.403 INFO:tasks.workunit.client.0.vm01.stderr:+ cut -d ' ' -f 4-5 2026-03-24T17:19:12.405 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^4 GiB' 2026-03-24T17:19:12.425 INFO:tasks.workunit.client.0.vm01.stderr:warning: fast-diff map is not enabled for test1. operation may be slow. 2026-03-24T17:19:12.431 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:19:12.426+0000 7f7ca568d640 0 -- 192.168.123.101:0/3136874907 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f7c8c002350 msgr2=0x55eecbd00950 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:19:12.433 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:19:12.426+0000 7f7ca6916640 0 -- 192.168.123.101:0/3136874907 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55eecbc4f3b0 msgr2=0x55eecbc902d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:19:12.491 INFO:tasks.workunit.client.0.vm01.stdout:4 GiB 2026-03-24T17:19:12.491 INFO:tasks.workunit.client.0.vm01.stderr:+ ret=0 2026-03-24T17:19:12.491 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' 0 = 0 ']' 2026-03-24T17:19:12.491 INFO:tasks.workunit.client.0.vm01.stderr:+ break 2026-03-24T17:19:12.491 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd du 2026-03-24T17:19:12.512 INFO:tasks.workunit.client.0.vm01.stderr:warning: fast-diff map is not enabled for test1. operation may be slow. 
2026-03-24T17:19:12.518 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:19:12.514+0000 7f19afc10640 0 -- 192.168.123.101:0/2660207252 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x564cb0442fe0 msgr2=0x564cb04726c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:19:12.519 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:19:12.514+0000 7f19afc10640 0 -- 192.168.123.101:0/2660207252 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7f199005bf30 msgr2=0x7f199007c330 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:19:12.575 INFO:tasks.workunit.client.0.vm01.stdout:NAME PROVISIONED USED 2026-03-24T17:19:12.575 INFO:tasks.workunit.client.0.vm01.stdout:test1 4 GiB 4 GiB 2026-03-24T17:19:12.578 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' 0 '!=' 0 ']' 2026-03-24T17:19:12.578 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test1 2026-03-24T17:19:12.807 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 1% complete... Removing image: 2% complete... Removing image: 3% complete... Removing image: 4% complete... Removing image: 5% complete... Removing image: 6% complete... Removing image: 7% complete... Removing image: 8% complete... Removing image: 9% complete...2026-03-24T17:19:12.802+0000 7f1c913e0640 0 -- 192.168.123.101:0/4060062190 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x562709034320 msgr2=0x562709069390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:19:12.828 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 10% complete... 
Removing image: 11% complete...2026-03-24T17:19:12.822+0000 7f1c8bfff640 0 -- 192.168.123.101:0/4060062190 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f1c6800f2e0 msgr2=0x7f1c6800f780 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:19:14.718 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 12% complete... Removing image: 13% complete... Removing image: 14% complete... Removing image: 15% complete... Removing image: 16% complete... Removing image: 17% complete... Removing image: 18% complete... Removing image: 19% complete... Removing image: 20% complete... Removing image: 21% complete... Removing image: 22% complete... Removing image: 23% complete... Removing image: 24% complete... Removing image: 25% complete... Removing image: 26% complete... Removing image: 27% complete... Removing image: 28% complete... Removing image: 29% complete... Removing image: 30% complete... Removing image: 31% complete... Removing image: 32% complete... Removing image: 33% complete... Removing image: 34% complete... Removing image: 35% complete... Removing image: 36% complete... Removing image: 37% complete... Removing image: 38% complete... Removing image: 39% complete... Removing image: 40% complete... Removing image: 41% complete... Removing image: 42% complete... Removing image: 43% complete... Removing image: 44% complete... Removing image: 45% complete... Removing image: 46% complete... Removing image: 47% complete... Removing image: 48% complete... Removing image: 49% complete... Removing image: 50% complete... Removing image: 51% complete... Removing image: 52% complete... Removing image: 53% complete... Removing image: 54% complete... Removing image: 55% complete... Removing image: 56% complete... Removing image: 57% complete... Removing image: 58% complete... Removing image: 59% complete... Removing image: 60% complete... Removing image: 61% complete... Removing image: 62% complete... 
Removing image: 63% complete... Removing image: 64% complete... Removing image: 65% complete... Removing image: 66% complete... Removing image: 67% complete... Removing image: 68% complete... Removing image: 69% complete... Removing image: 70% complete... Removing image: 71% complete... Removing image: 72% complete... Removing image: 73% complete... Removing image: 74% complete... Removing image: 75% complete... Removing image: 76% complete... Removing image: 77% complete... Removing image: 78% complete... Removing image: 79% complete... Removing image: 80% complete... Removing image: 81% complete... Removing image: 82% complete... Removing image: 83% complete... Removing image: 84% complete... Removing image: 85% complete... Removing image: 86% complete... Removing image: 87% complete... Removing image: 88% complete... Removing image: 89% complete... Removing image: 90% complete... Removing image: 91% complete... Removing image: 92% complete... Removing image: 93% complete... Removing image: 94% complete... Removing image: 95% complete... Removing image: 96% complete... Removing image: 97% complete... Removing image: 98% complete... Removing image: 99% complete... Removing image: 100% complete...done. 2026-03-24T17:19:14.722 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls 2026-03-24T17:19:14.722 INFO:tasks.workunit.client.0.vm01.stderr:+ grep test1 2026-03-24T17:19:14.722 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:19:14.722 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^0$' 2026-03-24T17:19:14.743 INFO:tasks.workunit.client.0.vm01.stdout:0 2026-03-24T17:19:14.744 INFO:tasks.workunit.client.0.vm01.stdout:testing import, export, resize, and snapshots... 
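The thick-provision test above verifies allocation by retrying `rbd du` in a bounded loop (`'[' 0 -lt 10 ']'` in the xtrace) until the USED column equals the provisioned size, since usage stats can lag briefly after creation. The parsing pipeline can be reproduced on the du table captured in the log; the helper below hard-codes that table, so no cluster is needed.

```shell
# Reproduce the 'rbd du' USED-size check from the thick-provision test,
# using a hard-coded copy of the du table captured above.
rbd_du_sample() {
    printf '%s\n' \
        'NAME  PROVISIONED  USED' \
        'test1       4 GiB  4 GiB'
}

# Same pipeline as the xtrace: pick the image row, squeeze repeated
# blanks, keep fields 4-5 (the USED column), and require an exact
# match against the provisioned size.
used=$(rbd_du_sample | grep test1 | tr -s ' ' | cut -d ' ' -f 4-5)
if echo "$used" | grep -q '^4 GiB'; then
    echo "thick provisioning verified: USED=$used"
fi
```

The field-based `cut` is why the test squeezes blanks first: `rbd du` pads its columns with a variable number of spaces, and `tr -s ' '` normalizes them so fields 4-5 are always the USED size and unit.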
2026-03-24T17:19:14.744 INFO:tasks.workunit.client.0.vm01.stderr:+ RBD_CREATE_ARGS='--image-format 2' 2026-03-24T17:19:14.744 INFO:tasks.workunit.client.0.vm01.stderr:+ test_others 2026-03-24T17:19:14.744 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'testing import, export, resize, and snapshots...' 2026-03-24T17:19:14.744 INFO:tasks.workunit.client.0.vm01.stderr:+ TMP_FILES='/tmp/img1 /tmp/img1.new /tmp/img2 /tmp/img2.new /tmp/img3 /tmp/img3.new /tmp/img-diff1.new /tmp/img-diff2.new /tmp/img-diff3.new /tmp/img1.snap1 /tmp/img1.snap1 /tmp/img-diff1.snap1' 2026-03-24T17:19:14.744 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images 2026-03-24T17:19:14.744 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:14.805 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:14.866 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:14.929 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:14.991 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:15.058 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:15.123 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:15.185 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:15.246 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:15.305 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:15.366 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:15.426 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:15.486 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:15.546 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:15.606 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:15.674 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 
2026-03-24T17:19:15.739 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:15.802 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:15.865 INFO:tasks.workunit.client.0.vm01.stderr:+ rm -f /tmp/img1 /tmp/img1.new /tmp/img2 /tmp/img2.new /tmp/img3 /tmp/img3.new /tmp/img-diff1.new /tmp/img-diff2.new /tmp/img-diff3.new /tmp/img1.snap1 /tmp/img1.snap1 /tmp/img-diff1.snap1 2026-03-24T17:19:15.865 INFO:tasks.workunit.client.0.vm01.stderr:+ dd if=/bin/sh of=/tmp/img1 bs=1k count=1 seek=10 2026-03-24T17:19:15.867 INFO:tasks.workunit.client.0.vm01.stderr:1+0 records in 2026-03-24T17:19:15.867 INFO:tasks.workunit.client.0.vm01.stderr:1+0 records out 2026-03-24T17:19:15.867 INFO:tasks.workunit.client.0.vm01.stderr:1024 bytes (1.0 kB, 1.0 KiB) copied, 6.2518e-05 s, 16.4 MB/s 2026-03-24T17:19:15.867 INFO:tasks.workunit.client.0.vm01.stderr:+ dd if=/bin/dd of=/tmp/img1 bs=1k count=10 seek=100 2026-03-24T17:19:15.868 INFO:tasks.workunit.client.0.vm01.stderr:10+0 records in 2026-03-24T17:19:15.868 INFO:tasks.workunit.client.0.vm01.stderr:10+0 records out 2026-03-24T17:19:15.868 INFO:tasks.workunit.client.0.vm01.stderr:10240 bytes (10 kB, 10 KiB) copied, 7.2366e-05 s, 142 MB/s 2026-03-24T17:19:15.868 INFO:tasks.workunit.client.0.vm01.stderr:+ dd if=/bin/rm of=/tmp/img1 bs=1k count=100 seek=1000 2026-03-24T17:19:15.868 INFO:tasks.workunit.client.0.vm01.stderr:58+1 records in 2026-03-24T17:19:15.868 INFO:tasks.workunit.client.0.vm01.stderr:58+1 records out 2026-03-24T17:19:15.868 INFO:tasks.workunit.client.0.vm01.stderr:59912 bytes (60 kB, 59 KiB) copied, 0.000174056 s, 344 MB/s 2026-03-24T17:19:15.869 INFO:tasks.workunit.client.0.vm01.stderr:+ dd if=/bin/ls of=/tmp/img1 bs=1k seek=10000 2026-03-24T17:19:15.871 INFO:tasks.workunit.client.0.vm01.stderr:134+1 records in 2026-03-24T17:19:15.871 INFO:tasks.workunit.client.0.vm01.stderr:134+1 records out 2026-03-24T17:19:15.871 INFO:tasks.workunit.client.0.vm01.stderr:138216 bytes (138 
kB, 135 KiB) copied, 0.00162733 s, 84.9 MB/s 2026-03-24T17:19:15.871 INFO:tasks.workunit.client.0.vm01.stderr:+ dd if=/bin/ln of=/tmp/img1 bs=1k seek=100000 2026-03-24T17:19:15.872 INFO:tasks.workunit.client.0.vm01.stderr:58+1 records in 2026-03-24T17:19:15.872 INFO:tasks.workunit.client.0.vm01.stderr:58+1 records out 2026-03-24T17:19:15.872 INFO:tasks.workunit.client.0.vm01.stderr:59912 bytes (60 kB, 59 KiB) copied, 0.000164067 s, 365 MB/s 2026-03-24T17:19:15.872 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd import --image-format 2 /tmp/img1 testimg1 2026-03-24T17:19:15.976 INFO:tasks.workunit.client.0.vm01.stderr: Importing image: 4% complete... Importing image: 8% complete... Importing image: 12% complete... Importing image: 16% complete... Importing image: 20% complete... Importing image: 24% complete... Importing image: 28% complete... Importing image: 32% complete... Importing image: 36% complete... Importing image: 40% complete... Importing image: 45% complete... Importing image: 49% complete... Importing image: 53% complete... Importing image: 57% complete... Importing image: 61% complete... Importing image: 65% complete... Importing image: 69% complete... Importing image: 73% complete... Importing image: 77% complete... Importing image: 81% complete... Importing image: 85% complete... Importing image: 90% complete... Importing image: 94% complete... Importing image: 98% complete... Importing image: 100% complete...done. 2026-03-24T17:19:15.980 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd resize testimg1 --size=256 --allow-shrink 2026-03-24T17:19:16.013 INFO:tasks.workunit.client.0.vm01.stderr: Resizing image: 100% complete...done. 2026-03-24T17:19:16.020 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd export testimg1 /tmp/img2 2026-03-24T17:19:16.088 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 1% complete... Exporting image: 3% complete... Exporting image: 4% complete... Exporting image: 6% complete... Exporting image: 7% complete... 
Exporting image: 9% complete... Exporting image: 10% complete... Exporting image: 12% complete... Exporting image: 14% complete... Exporting image: 15% complete... Exporting image: 17% complete... Exporting image: 18% complete... Exporting image: 20% complete... Exporting image: 21% complete... Exporting image: 23% complete... Exporting image: 25% complete... Exporting image: 26% complete... Exporting image: 28% complete... Exporting image: 29% complete... Exporting image: 31% complete... Exporting image: 32% complete... Exporting image: 34% complete... Exporting image: 35% complete... Exporting image: 37% complete... Exporting image: 39% complete... Exporting image: 40% complete... Exporting image: 42% complete... Exporting image: 43% complete... Exporting image: 45% complete... Exporting image: 46% complete... Exporting image: 48% complete... Exporting image: 50% complete... Exporting image: 51% complete... Exporting image: 53% complete... Exporting image: 54% complete... Exporting image: 56% complete... Exporting image: 57% complete... Exporting image: 59% complete... Exporting image: 60% complete... Exporting image: 62% complete... Exporting image: 64% complete... Exporting image: 65% complete... Exporting image: 67% complete... Exporting image: 68% complete... Exporting image: 70% complete... Exporting image: 71% complete... Exporting image: 73% complete... Exporting image: 75% complete... Exporting image: 76% complete... Exporting image: 78% complete... Exporting image: 79% complete... Exporting image: 81% complete... Exporting image: 82% complete... Exporting image: 84% complete... Exporting image: 85% complete... Exporting image: 87% complete... Exporting image: 89% complete... Exporting image: 90% complete... Exporting image: 92% complete... Exporting image: 93% complete... Exporting image: 95% complete... Exporting image: 96% complete... Exporting image: 98% complete... Exporting image: 100% complete...done. 
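The `dd` sequence above builds the import source as a sparse file: small chunks copied from real binaries (`/bin/sh`, `/bin/dd`, `/bin/rm`, ...) at widely spaced `seek=` offsets, so the file has data islands separated by holes. A shrunken, self-contained version of that pattern (offsets reduced; the original seeks out to 100000 KiB, and the `stat` format flags assume GNU coreutils):

```shell
# Sketch of the sparse test-data pattern: dd with seek= writes small
# chunks at scattered 1 KiB-block offsets, leaving holes between them.
f=$(mktemp)
dd if=/bin/sh of="$f" bs=1k count=1  seek=10   2>/dev/null
dd if=/bin/sh of="$f" bs=1k count=10 seek=100  2>/dev/null
dd if=/bin/sh of="$f" bs=1k count=1  seek=1000 2>/dev/null

# Apparent size counts the holes: last write ends at (1000+1) KiB.
apparent=$(stat -c %s "$f")
# Allocated bytes (512-byte blocks actually backed by storage) will
# typically be far smaller on a filesystem with sparse-file support.
allocated=$(( $(stat -c %b "$f") * 512 ))
echo "apparent=$apparent allocated=$allocated"
rm -f "$f"
```

Importing such a file exercises rbd's sparse handling: only the data islands should consume storage in the image, which is what the later `--sparse-size` import-diff and copy variants probe at different granularities.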
2026-03-24T17:19:16.093 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create testimg1 --snap=snap1 2026-03-24T17:19:16.312 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:19:16.310+0000 7fa04a4bc640 0 --2- 192.168.123.101:0/1675296411 >> [v2:192.168.123.101:3300/0,v1:192.168.123.101:6789/0] conn(0x55f493fbfd80 0x55f494062090 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 2026-03-24T17:19:17.233 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-24T17:19:17.239 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd resize testimg1 --size=128 2026-03-24T17:19:17.264 INFO:tasks.workunit.client.0.vm01.stderr:rbd: shrinking an image is only allowed with the --allow-shrink flag 2026-03-24T17:19:17.268 INFO:tasks.workunit.client.0.vm01.stderr:+ true 2026-03-24T17:19:17.268 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd resize testimg1 --size=128 --allow-shrink 2026-03-24T17:19:17.297 INFO:tasks.workunit.client.0.vm01.stderr: Resizing image: 50% complete... Resizing image: 51% complete... Resizing image: 53% complete... Resizing image: 54% complete... Resizing image: 56% complete... Resizing image: 57% complete... Resizing image: 59% complete... Resizing image: 60% complete... Resizing image: 62% complete... Resizing image: 64% complete... Resizing image: 65% complete... Resizing image: 67% complete... Resizing image: 68% complete... Resizing image: 70% complete... Resizing image: 71% complete... Resizing image: 73% complete... Resizing image: 75% complete... Resizing image: 76% complete... Resizing image: 78% complete... Resizing image: 79% complete... Resizing image: 81% complete... Resizing image: 82% complete... Resizing image: 84% complete... Resizing image: 85% complete... Resizing image: 87% complete... Resizing image: 89% complete... Resizing image: 90% complete... Resizing image: 92% complete... 
Resizing image: 93% complete... Resizing image: 95% complete... Resizing image: 96% complete... Resizing image: 98% complete... Resizing image: 100% complete...done. 2026-03-24T17:19:17.303 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd export testimg1 /tmp/img3 2026-03-24T17:19:17.350 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 3% complete... Exporting image: 6% complete... Exporting image: 9% complete... Exporting image: 12% complete... Exporting image: 15% complete... Exporting image: 18% complete... Exporting image: 21% complete... Exporting image: 25% complete... Exporting image: 28% complete... Exporting image: 31% complete... Exporting image: 34% complete... Exporting image: 37% complete... Exporting image: 40% complete... Exporting image: 43% complete... Exporting image: 46% complete... Exporting image: 50% complete... Exporting image: 53% complete... Exporting image: 56% complete... Exporting image: 59% complete... Exporting image: 62% complete... Exporting image: 65% complete... Exporting image: 68% complete... Exporting image: 71% complete... Exporting image: 75% complete... Exporting image: 78% complete... Exporting image: 81% complete... Exporting image: 84% complete... Exporting image: 87% complete... Exporting image: 90% complete... Exporting image: 93% complete... Exporting image: 96% complete... Exporting image: 100% complete...done. 
2026-03-24T17:19:17.354 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info testimg1
2026-03-24T17:19:17.354 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'size 128 MiB'
2026-03-24T17:19:17.380 INFO:tasks.workunit.client.0.vm01.stdout: size 128 MiB in 32 objects
2026-03-24T17:19:17.380 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info --snap=snap1 testimg1
2026-03-24T17:19:17.381 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'size 256 MiB'
2026-03-24T17:19:17.407 INFO:tasks.workunit.client.0.vm01.stdout: size 256 MiB in 64 objects
2026-03-24T17:19:17.407 INFO:tasks.workunit.client.0.vm01.stderr:+ rm -rf /tmp/diff-testimg1-1 /tmp/diff-testimg1-2
2026-03-24T17:19:17.409 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd export-diff testimg1 --snap=snap1 /tmp/diff-testimg1-1
2026-03-24T17:19:17.439 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 3% complete... Exporting image: 37% complete... Exporting image: 100% complete...done.
2026-03-24T17:19:17.442 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd export-diff testimg1 --from-snap=snap1 /tmp/diff-testimg1-2
2026-03-24T17:19:17.468 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 100% complete...done.
2026-03-24T17:19:17.471 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size=1 testimg-diff1
2026-03-24T17:19:17.503 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd import-diff --sparse-size 8K /tmp/diff-testimg1-1 testimg-diff1
2026-03-24T17:19:18.236 INFO:tasks.workunit.client.0.vm01.stderr: Importing image diff: 22% complete... Importing image diff: 63% complete... Importing image diff: 99% complete... Importing image diff: 100% complete...done.
2026-03-24T17:19:18.243 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd import-diff --sparse-size 8K /tmp/diff-testimg1-2 testimg-diff1
2026-03-24T17:19:18.273 INFO:tasks.workunit.client.0.vm01.stderr: Importing image diff: 68% complete... Importing image diff: 96% complete... Importing image diff: 100% complete...done.
2026-03-24T17:19:18.280 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info testimg1
2026-03-24T17:19:18.280 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'size 128 MiB'
2026-03-24T17:19:18.308 INFO:tasks.workunit.client.0.vm01.stdout: size 128 MiB in 32 objects
2026-03-24T17:19:18.308 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info --snap=snap1 testimg1
2026-03-24T17:19:18.308 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'size 256 MiB'
2026-03-24T17:19:18.336 INFO:tasks.workunit.client.0.vm01.stdout: size 256 MiB in 64 objects
2026-03-24T17:19:18.336 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info testimg-diff1
2026-03-24T17:19:18.336 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'size 128 MiB'
2026-03-24T17:19:18.362 INFO:tasks.workunit.client.0.vm01.stdout: size 128 MiB in 32 objects
2026-03-24T17:19:18.362 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info --snap=snap1 testimg-diff1
2026-03-24T17:19:18.362 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'size 256 MiB'
2026-03-24T17:19:18.389 INFO:tasks.workunit.client.0.vm01.stdout: size 256 MiB in 64 objects
2026-03-24T17:19:18.389 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd copy testimg1 --snap=snap1 testimg2
2026-03-24T17:19:18.444 INFO:tasks.workunit.client.0.vm01.stderr: Image copy: 100% complete...done.
2026-03-24T17:19:18.449 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd copy testimg1 testimg3
2026-03-24T17:19:18.517 INFO:tasks.workunit.client.0.vm01.stderr: Image copy: 100% complete...done.
2026-03-24T17:19:18.522 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd copy testimg-diff1 --sparse-size 768K --snap=snap1 testimg-diff2
2026-03-24T17:19:18.593 INFO:tasks.workunit.client.0.vm01.stderr: Image copy: 100% complete...done.
2026-03-24T17:19:18.598 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd copy testimg-diff1 --sparse-size 768K testimg-diff3
2026-03-24T17:19:18.680 INFO:tasks.workunit.client.0.vm01.stderr: Image copy: 100% complete...done.
2026-03-24T17:19:18.686 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info testimg2
2026-03-24T17:19:18.686 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'size 256 MiB'
2026-03-24T17:19:18.715 INFO:tasks.workunit.client.0.vm01.stdout: size 256 MiB in 64 objects
2026-03-24T17:19:18.715 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info testimg3
2026-03-24T17:19:18.715 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'size 128 MiB'
2026-03-24T17:19:18.741 INFO:tasks.workunit.client.0.vm01.stdout: size 128 MiB in 32 objects
2026-03-24T17:19:18.741 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info testimg-diff2
2026-03-24T17:19:18.742 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'size 256 MiB'
2026-03-24T17:19:18.766 INFO:tasks.workunit.client.0.vm01.stdout: size 256 MiB in 64 objects
2026-03-24T17:19:18.767 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info testimg-diff3
2026-03-24T17:19:18.767 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'size 128 MiB'
2026-03-24T17:19:18.793 INFO:tasks.workunit.client.0.vm01.stdout: size 128 MiB in 32 objects
2026-03-24T17:19:18.793 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd deep copy testimg1 testimg4
2026-03-24T17:19:19.261 INFO:tasks.workunit.client.0.vm01.stderr: Image deep copy: 100% complete...done.
2026-03-24T17:19:19.265 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd deep copy testimg1 --snap=snap1 testimg5
2026-03-24T17:19:20.262 INFO:tasks.workunit.client.0.vm01.stderr: Image deep copy: 100% complete...done.
2026-03-24T17:19:20.266 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info testimg4
2026-03-24T17:19:20.266 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'size 128 MiB'
2026-03-24T17:19:20.293 INFO:tasks.workunit.client.0.vm01.stdout: size 128 MiB in 32 objects
2026-03-24T17:19:20.293 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info testimg5
2026-03-24T17:19:20.293 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'size 256 MiB'
2026-03-24T17:19:20.319 INFO:tasks.workunit.client.0.vm01.stdout: size 256 MiB in 64 objects
2026-03-24T17:19:20.319 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap ls testimg4
2026-03-24T17:19:20.319 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -v SNAPID
2026-03-24T17:19:20.319 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:19:20.319 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 1
2026-03-24T17:19:20.346 INFO:tasks.workunit.client.0.vm01.stdout:1
2026-03-24T17:19:20.346 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap ls testimg4
2026-03-24T17:19:20.346 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '.*snap1.*'
2026-03-24T17:19:20.373 INFO:tasks.workunit.client.0.vm01.stdout: 15 snap1 256 MiB Tue Mar 24 17:19:19 2026
2026-03-24T17:19:20.373 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd export testimg1 /tmp/img1.new
2026-03-24T17:19:20.417 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 100% complete...done.
2026-03-24T17:19:20.423 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd export testimg2 /tmp/img2.new
2026-03-24T17:19:20.493 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 100% complete...done.
2026-03-24T17:19:20.499 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd export testimg3 /tmp/img3.new
2026-03-24T17:19:20.547 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 100% complete...done.
2026-03-24T17:19:20.552 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd export testimg-diff1 /tmp/img-diff1.new
2026-03-24T17:19:20.597 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 100% complete...done.
2026-03-24T17:19:20.603 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd export testimg-diff2 /tmp/img-diff2.new
2026-03-24T17:19:20.667 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 100% complete...done.
2026-03-24T17:19:20.673 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd export testimg-diff3 /tmp/img-diff3.new
2026-03-24T17:19:20.720 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 100% complete...done.
2026-03-24T17:19:20.727 INFO:tasks.workunit.client.0.vm01.stderr:+ cmp /tmp/img2 /tmp/img2.new
2026-03-24T17:19:20.867 INFO:tasks.workunit.client.0.vm01.stderr:+ cmp /tmp/img3 /tmp/img3.new
2026-03-24T17:19:20.935 INFO:tasks.workunit.client.0.vm01.stderr:+ cmp /tmp/img2 /tmp/img-diff2.new
2026-03-24T17:19:21.044 INFO:tasks.workunit.client.0.vm01.stderr:+ cmp /tmp/img3 /tmp/img-diff3.new
2026-03-24T17:19:21.099 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap rollback --snap=snap1 testimg1
2026-03-24T17:19:21.148 INFO:tasks.workunit.client.0.vm01.stderr: Rolling back to snapshot: 100% complete...done.
2026-03-24T17:19:21.155 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap rollback --snap=snap1 testimg-diff1
2026-03-24T17:19:21.202 INFO:tasks.workunit.client.0.vm01.stderr: Rolling back to snapshot: 100% complete...done.
2026-03-24T17:19:21.209 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info testimg1
2026-03-24T17:19:21.209 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'size 256 MiB'
2026-03-24T17:19:21.235 INFO:tasks.workunit.client.0.vm01.stdout: size 256 MiB in 64 objects
2026-03-24T17:19:21.235 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info testimg-diff1
2026-03-24T17:19:21.235 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'size 256 MiB'
2026-03-24T17:19:21.264 INFO:tasks.workunit.client.0.vm01.stdout: size 256 MiB in 64 objects
2026-03-24T17:19:21.264 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd export testimg1 /tmp/img1.snap1
2026-03-24T17:19:21.317 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 100% complete...done.
2026-03-24T17:19:21.322 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd export testimg-diff1 /tmp/img-diff1.snap1
2026-03-24T17:19:21.372 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 100% complete...done.
2026-03-24T17:19:21.377 INFO:tasks.workunit.client.0.vm01.stderr:+ cmp /tmp/img2 /tmp/img1.snap1
2026-03-24T17:19:21.495 INFO:tasks.workunit.client.0.vm01.stderr:+ cmp /tmp/img2 /tmp/img-diff1.snap1
2026-03-24T17:19:21.620 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm testimg2
2026-03-24T17:19:21.792 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 98% complete...2026-03-24T17:19:21.786+0000 7f5bc8160640 0 -- 192.168.123.101:0/2645767987 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x5591aabfd320 msgr2=0x5591aac328a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:19:21.810 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:19:21.814 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm testimg3
2026-03-24T17:19:21.883 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 96% complete...2026-03-24T17:19:21.878+0000 7f53d2417640 0 -- 192.168.123.101:0/2836477764 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7f53b005beb0 msgr2=0x7f53b007c290 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:19:21.892 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:19:21.895 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create testimg2 -s 0
2026-03-24T17:19:21.917 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:19:21.910+0000 7f5a02d60200 -1 librbd: Forced V1 image creation.
2026-03-24T17:19:21.923 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd cp testimg2 testimg3
2026-03-24T17:19:21.956 INFO:tasks.workunit.client.0.vm01.stderr: Image copy: 100% complete...done.
2026-03-24T17:19:21.959 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd deep cp testimg2 testimg6
2026-03-24T17:19:21.992 INFO:tasks.workunit.client.0.vm01.stderr: Image deep copy: 100% complete...done.
2026-03-24T17:19:21.996 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap rm --snap=snap1 testimg1
2026-03-24T17:19:22.250 INFO:tasks.workunit.client.0.vm01.stderr: Removing snap: 100% complete...done.
2026-03-24T17:19:22.258 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap rm --snap=snap1 testimg-diff1
2026-03-24T17:19:23.258 INFO:tasks.workunit.client.0.vm01.stderr: Removing snap: 100% complete...done.
2026-03-24T17:19:23.266 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info --snap=snap1 testimg1
2026-03-24T17:19:23.266 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'error setting snapshot context: (2) No such file or directory'
2026-03-24T17:19:23.293 INFO:tasks.workunit.client.0.vm01.stdout:error setting snapshot context: (2) No such file or directory
2026-03-24T17:19:23.293 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info --snap=snap1 testimg-diff1
2026-03-24T17:19:23.293 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'error setting snapshot context: (2) No such file or directory'
2026-03-24T17:19:23.321 INFO:tasks.workunit.client.0.vm01.stdout:error setting snapshot context: (2) No such file or directory
2026-03-24T17:19:23.322 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd sparsify testimg1
2026-03-24T17:19:23.385 INFO:tasks.workunit.client.0.vm01.stderr: Image sparsify: 1% complete... [intermediate progress updates elided] Image sparsify: 100% complete...done.
2026-03-24T17:19:23.397 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images
2026-03-24T17:19:23.397 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
['+ for img in $IMGS' trace line repeated once per image in $IMGS; per-iteration timestamps elided]
2026-03-24T17:19:26.638 INFO:tasks.workunit.client.0.vm01.stderr:+ rm -f /tmp/img1 /tmp/img1.new /tmp/img2 /tmp/img2.new /tmp/img3 /tmp/img3.new /tmp/img-diff1.new /tmp/img-diff2.new /tmp/img-diff3.new /tmp/img1.snap1 /tmp/img1.snap1 /tmp/img-diff1.snap1
2026-03-24T17:19:26.761 INFO:tasks.workunit.client.0.vm01.stdout:testing locking...
2026-03-24T17:19:26.761 INFO:tasks.workunit.client.0.vm01.stderr:+ test_locking
2026-03-24T17:19:26.761 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'testing locking...'
2026-03-24T17:19:26.761 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images
2026-03-24T17:19:26.761 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
['+ for img in $IMGS' trace line repeated once per image in $IMGS; per-iteration timestamps elided]
2026-03-24T17:19:28.163 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 -s 1 test1
2026-03-24T17:19:28.197 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd lock list test1
2026-03-24T17:19:28.197 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:19:28.197 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^0$'
2026-03-24T17:19:28.223 INFO:tasks.workunit.client.0.vm01.stdout:0
2026-03-24T17:19:28.223 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd lock add test1 id
2026-03-24T17:19:28.253 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd lock list test1
2026-03-24T17:19:28.253 INFO:tasks.workunit.client.0.vm01.stderr:+ grep ' 1 '
2026-03-24T17:19:28.280 INFO:tasks.workunit.client.0.vm01.stdout:There is 1 exclusive lock on this image.
2026-03-24T17:19:28.281 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd lock list test1
2026-03-24T17:19:28.281 INFO:tasks.workunit.client.0.vm01.stderr:++ tail -n 1
2026-03-24T17:19:28.281 INFO:tasks.workunit.client.0.vm01.stderr:++ awk '{print $1;}'
2026-03-24T17:19:28.307 INFO:tasks.workunit.client.0.vm01.stderr:+ LOCKER=client.8388
2026-03-24T17:19:28.307 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd lock remove test1 id client.8388
2026-03-24T17:19:29.275 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd lock list test1
2026-03-24T17:19:29.275 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:19:29.275 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^0$'
2026-03-24T17:19:29.301 INFO:tasks.workunit.client.0.vm01.stdout:0
2026-03-24T17:19:29.301 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd lock add test1 id --shared tag
2026-03-24T17:19:29.330 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd lock list test1
2026-03-24T17:19:29.330 INFO:tasks.workunit.client.0.vm01.stderr:+ grep ' 1 '
2026-03-24T17:19:29.355 INFO:tasks.workunit.client.0.vm01.stdout:There is 1 shared lock on this image.
2026-03-24T17:19:29.355 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd lock add test1 id --shared tag
2026-03-24T17:19:29.385 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd lock list test1
2026-03-24T17:19:29.385 INFO:tasks.workunit.client.0.vm01.stderr:+ grep ' 2 '
2026-03-24T17:19:29.412 INFO:tasks.workunit.client.0.vm01.stdout:There are 2 shared locks on this image.
2026-03-24T17:19:29.412 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd lock add test1 id2 --shared tag
2026-03-24T17:19:29.443 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd lock list test1
2026-03-24T17:19:29.443 INFO:tasks.workunit.client.0.vm01.stderr:+ grep ' 3 '
2026-03-24T17:19:29.468 INFO:tasks.workunit.client.0.vm01.stdout:There are 3 shared locks on this image.
2026-03-24T17:19:29.469 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd lock list test1
2026-03-24T17:19:29.469 INFO:tasks.workunit.client.0.vm01.stderr:+ tail -n 1
2026-03-24T17:19:29.469 INFO:tasks.workunit.client.0.vm01.stderr:+ awk '{print $2, $1;}'
2026-03-24T17:19:29.469 INFO:tasks.workunit.client.0.vm01.stderr:+ xargs rbd lock remove test1
2026-03-24T17:19:30.278 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info test1
2026-03-24T17:19:30.278 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -qE 'features:.*exclusive'
2026-03-24T17:19:30.303 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd lock list test1
2026-03-24T17:19:30.329 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' -n 'There are 2 shared locks on this image.
2026-03-24T17:19:30.329 INFO:tasks.workunit.client.0.vm01.stderr:Lock tag: tag
2026-03-24T17:19:30.330 INFO:tasks.workunit.client.0.vm01.stderr:Locker ID Address
2026-03-24T17:19:30.330 INFO:tasks.workunit.client.0.vm01.stderr:client.8402 id 192.168.123.101:0/1643403251
2026-03-24T17:19:30.330 INFO:tasks.workunit.client.0.vm01.stderr:client.8408 id 192.168.123.101:0/3777821978' ']'
2026-03-24T17:19:30.330 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd lock list test1
2026-03-24T17:19:30.330 INFO:tasks.workunit.client.0.vm01.stderr:+ tail -n 1
2026-03-24T17:19:30.330 INFO:tasks.workunit.client.0.vm01.stderr:+ awk '{print $2, $1;}'
2026-03-24T17:19:30.330 INFO:tasks.workunit.client.0.vm01.stderr:+ xargs rbd lock remove test1
2026-03-24T17:19:31.282 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd lock list test1
2026-03-24T17:19:31.307 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' -n 'There is 1 shared lock on this image.
2026-03-24T17:19:31.308 INFO:tasks.workunit.client.0.vm01.stderr:Lock tag: tag
2026-03-24T17:19:31.308 INFO:tasks.workunit.client.0.vm01.stderr:Locker ID Address
2026-03-24T17:19:31.308 INFO:tasks.workunit.client.0.vm01.stderr:client.8402 id 192.168.123.101:0/1643403251' ']'
2026-03-24T17:19:31.308 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd lock list test1
2026-03-24T17:19:31.308 INFO:tasks.workunit.client.0.vm01.stderr:+ tail -n 1
2026-03-24T17:19:31.308 INFO:tasks.workunit.client.0.vm01.stderr:+ awk '{print $2, $1;}'
2026-03-24T17:19:31.308 INFO:tasks.workunit.client.0.vm01.stderr:+ xargs rbd lock remove test1
2026-03-24T17:19:31.846 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd lock list test1
2026-03-24T17:19:31.871 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' -n '' ']'
2026-03-24T17:19:31.871 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test1
2026-03-24T17:19:31.929 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:19:31.922+0000 7f586fefa640 0 -- 192.168.123.101:0/2711807451 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x558c9179c150 msgr2=0x558c9177f3c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:19:31.933 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:19:31.937 INFO:tasks.workunit.client.0.vm01.stderr:+ test_clone
2026-03-24T17:19:31.937 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'testing clone...'
2026-03-24T17:19:31.937 INFO:tasks.workunit.client.0.vm01.stdout:testing clone...
2026-03-24T17:19:31.937 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images
2026-03-24T17:19:31.937 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
['+ for img in $IMGS' trace line repeated once per image in $IMGS; per-iteration timestamps elided]
2026-03-24T17:19:33.024 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create test1 --image-format 2 -s 1
2026-03-24T17:19:33.059 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create test1@s1
2026-03-24T17:19:33.289 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T17:19:33.297 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap protect test1@s1
2026-03-24T17:19:33.329 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd pool create rbd2 8
2026-03-24T17:19:34.334 INFO:tasks.workunit.client.0.vm01.stderr:pool 'rbd2' already exists
2026-03-24T17:19:34.348 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd pool init rbd2
2026-03-24T17:19:37.340 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone test1@s1 rbd2/clone
2026-03-24T17:19:37.381 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd -p rbd2 ls
2026-03-24T17:19:37.382 INFO:tasks.workunit.client.0.vm01.stderr:+ grep clone
2026-03-24T17:19:37.405 INFO:tasks.workunit.client.0.vm01.stdout:clone
2026-03-24T17:19:37.405 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd -p rbd2 ls -l
2026-03-24T17:19:37.405 INFO:tasks.workunit.client.0.vm01.stderr:+ grep clone
2026-03-24T17:19:37.405 INFO:tasks.workunit.client.0.vm01.stderr:+ grep test1@s1
2026-03-24T17:19:37.435 INFO:tasks.workunit.client.0.vm01.stdout:clone 1 MiB rbd/test1@s1 2
2026-03-24T17:19:37.435 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd ls
2026-03-24T17:19:37.458 INFO:tasks.workunit.client.0.vm01.stderr:+ test test1 = test1
2026-03-24T17:19:37.458 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd flatten rbd2/clone
2026-03-24T17:19:37.488 INFO:tasks.workunit.client.0.vm01.stderr: Image flatten: 100% complete...done.
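The lock-release step in the test_locking trace above pipes `rbd lock list test1 | tail -n 1 | awk '{print $2, $1;}' | xargs rbd lock remove test1`. A minimal sketch of that parse, runnable without a cluster, using only the `rbd lock list` output captured in this log as sample input (the function name is illustrative, not from the workunit):

```python
# Sketch of the pipeline: rbd lock list test1 | tail -n 1 | awk '{print $2, $1;}'
# SAMPLE is the lock-list output recorded in the trace above.

SAMPLE = """\
There are 2 shared locks on this image.
Lock tag: tag
Locker ID Address
client.8402 id 192.168.123.101:0/1643403251
client.8408 id 192.168.123.101:0/3777821978"""

def lock_remove_args(lock_list_output: str) -> str:
    """Mimic `tail -n 1 | awk '{print $2, $1;}'`: take the last line and
    swap its first two whitespace-separated fields (lock id, then locker)."""
    last = lock_list_output.splitlines()[-1]
    fields = last.split()
    return f"{fields[1]} {fields[0]}"

# xargs then runs: rbd lock remove test1 id client.8408
print(lock_remove_args(SAMPLE))  # → id client.8408
```

Because `tail -n 1` only ever sees the final listed lock, each pass of the loop in the trace frees exactly one lock, which is why `rbd lock list` is re-run until `'[' -n '' ']'` finally sees empty output.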
2026-03-24T17:19:37.494 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create rbd2/clone@s1
2026-03-24T17:19:38.343 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T17:19:38.349 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap protect rbd2/clone@s1
2026-03-24T17:19:38.379 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone rbd2/clone@s1 clone2
2026-03-24T17:19:38.424 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls
2026-03-24T17:19:38.424 INFO:tasks.workunit.client.0.vm01.stderr:+ grep clone2
2026-03-24T17:19:38.445 INFO:tasks.workunit.client.0.vm01.stdout:clone2
2026-03-24T17:19:38.445 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls -l
2026-03-24T17:19:38.445 INFO:tasks.workunit.client.0.vm01.stderr:+ grep clone2
2026-03-24T17:19:38.445 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2/clone@s1
2026-03-24T17:19:38.478 INFO:tasks.workunit.client.0.vm01.stdout:clone2 1 MiB rbd2/clone@s1 2
2026-03-24T17:19:38.479 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd -p rbd2 ls
2026-03-24T17:19:38.501 INFO:tasks.workunit.client.0.vm01.stderr:+ test clone = clone
2026-03-24T17:19:38.501 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone rbd2/clone clone3
2026-03-24T17:19:38.501 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'snapshot name was not specified'
2026-03-24T17:19:38.515 INFO:tasks.workunit.client.0.vm01.stdout:rbd: snapshot name was not specified
2026-03-24T17:19:38.515 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone rbd2/clone@invalid clone3
2026-03-24T17:19:38.515 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'failed to open parent image'
2026-03-24T17:19:38.542 INFO:tasks.workunit.client.0.vm01.stdout:2026-03-24T17:19:38.534+0000 7f70d97fa640 -1 librbd::image::CloneRequest: 0x556f7d6ab070 handle_open_parent: failed to open parent image: (2) No such file or directory
2026-03-24T17:19:38.542 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone rbd2/clone --snap-id 0 clone3
2026-03-24T17:19:38.542 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'failed to open parent image'
2026-03-24T17:19:38.569 INFO:tasks.workunit.client.0.vm01.stdout:2026-03-24T17:19:38.562+0000 7f8e527fc640 -1 librbd::image::CloneRequest: 0x56145edc9fd0 handle_open_parent: failed to open parent image: (2) No such file or directory
2026-03-24T17:19:38.570 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone rbd2/clone@invalid --snap-id 0 clone3
2026-03-24T17:19:38.570 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'trying to access snapshot using both name and id'
2026-03-24T17:19:38.585 INFO:tasks.workunit.client.0.vm01.stdout:rbd: trying to access snapshot using both name and id.
2026-03-24T17:19:38.585 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd snap ls rbd2/clone --format json
2026-03-24T17:19:38.585 INFO:tasks.workunit.client.0.vm01.stderr:++ jq '.[] | select(.name == "s1") | .id'
2026-03-24T17:19:38.613 INFO:tasks.workunit.client.0.vm01.stderr:+ SNAP_ID=3
2026-03-24T17:19:38.613 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone --snap-id 3 rbd2/clone clone3
2026-03-24T17:19:38.656 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls
2026-03-24T17:19:38.656 INFO:tasks.workunit.client.0.vm01.stderr:+ grep clone3
2026-03-24T17:19:38.680 INFO:tasks.workunit.client.0.vm01.stdout:clone3
2026-03-24T17:19:38.680 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls -l
2026-03-24T17:19:38.680 INFO:tasks.workunit.client.0.vm01.stderr:+ grep clone3
2026-03-24T17:19:38.680 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2/clone@s1
2026-03-24T17:19:38.713 INFO:tasks.workunit.client.0.vm01.stdout:clone3 1 MiB rbd2/clone@s1 2
2026-03-24T17:19:38.714 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd -p rbd2 ls
2026-03-24T17:19:38.737 INFO:tasks.workunit.client.0.vm01.stderr:+ test clone = clone
2026-03-24T17:19:38.737 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd ls -l
2026-03-24T17:19:38.737 INFO:tasks.workunit.client.0.vm01.stderr:++ grep -c rbd2/clone@s1
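The snap-id lookup in the trace above is `rbd snap ls rbd2/clone --format json | jq '.[] | select(.name == "s1") | .id'`, which resolved snapshot "s1" to SNAP_ID=3. A Python-stdlib equivalent of that jq filter, run against a hypothetical listing; the JSON field names mirror what the filter expects, but the exact record shape here is an assumption, not copied from the run:

```python
import json

# Hypothetical `rbd snap ls --format json` output; only the .name and .id
# fields matter to the jq filter traced above. The real run yielded id 3.
SNAP_LS_JSON = '[{"id": 3, "name": "s1", "size": 1048576}]'

def snap_id(snap_ls_json: str, name: str):
    """Equivalent of: jq '.[] | select(.name == NAME) | .id'"""
    for snap in json.loads(snap_ls_json):
        if snap["name"] == name:
            return snap["id"]
    return None  # jq would simply emit nothing

print(snap_id(SNAP_LS_JSON, "s1"))  # → 3
```

The resolved id is what the workunit then feeds back as `rbd clone --snap-id 3 rbd2/clone clone3`.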
2026-03-24T17:19:38.771 INFO:tasks.workunit.client.0.vm01.stderr:+ test 2 = 2
2026-03-24T17:19:38.771 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd flatten clone3
2026-03-24T17:19:38.802 INFO:tasks.workunit.client.0.vm01.stderr: Image flatten: 100% complete...done.
2026-03-24T17:19:38.808 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd ls -l
2026-03-24T17:19:38.808 INFO:tasks.workunit.client.0.vm01.stderr:++ grep -c rbd2/clone@s1
2026-03-24T17:19:38.840 INFO:tasks.workunit.client.0.vm01.stderr:+ test 1 = 1
2026-03-24T17:19:38.840 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm clone2
2026-03-24T17:19:38.897 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:19:38.890+0000 7f50fbc0d640 0 -- 192.168.123.101:0/239848561 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55f7906f7320 msgr2=0x55f79072c390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:19:38.905 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:19:38.908 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap unprotect rbd2/clone@s1
2026-03-24T17:19:38.938 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap rm rbd2/clone@s1
2026-03-24T17:19:39.346 INFO:tasks.workunit.client.0.vm01.stderr: Removing snap: 100% complete...done.
2026-03-24T17:19:39.353 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm rbd2/clone
2026-03-24T17:19:39.413 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:19:39.406+0000 7f7a139b2640 0 -- 192.168.123.101:0/3997355400 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x7f79ec008d30 msgr2=0x7f79ec0291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:19:39.413 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:19:39.417 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm clone3
2026-03-24T17:19:39.483 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:19:39.486 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap unprotect test1@s1
2026-03-24T17:19:39.519 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap rm test1@s1
2026-03-24T17:19:40.536 INFO:tasks.workunit.client.0.vm01.stderr: Removing snap: 100% complete...done.
2026-03-24T17:19:40.545 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test1
2026-03-24T17:19:40.608 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:19:40.602+0000 7fc3737fe640 0 -- 192.168.123.101:0/4289028922 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7fc350006e90 msgr2=0x7fc350002cf0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:19:40.612 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:19:40.615 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd pool rm rbd2 rbd2 --yes-i-really-really-mean-it
2026-03-24T17:19:41.589 INFO:tasks.workunit.client.0.vm01.stderr:pool 'rbd2' does not exist
2026-03-24T17:19:41.601 INFO:tasks.workunit.client.0.vm01.stdout:testing trash...
2026-03-24T17:19:41.601 INFO:tasks.workunit.client.0.vm01.stderr:+ test_trash
2026-03-24T17:19:41.601 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'testing trash...'
2026-03-24T17:19:41.601 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images
2026-03-24T17:19:41.601 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
['+ for img in $IMGS' trace line repeated once per image in $IMGS; per-iteration timestamps elided]
2026-03-24T17:19:43.117 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 -s 1 test1
2026-03-24T17:19:43.152 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 -s 1 test2
2026-03-24T17:19:43.187 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls
2026-03-24T17:19:43.187 INFO:tasks.workunit.client.0.vm01.stderr:+ grep test1
2026-03-24T17:19:43.209 INFO:tasks.workunit.client.0.vm01.stdout:test1
2026-03-24T17:19:43.209 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls
2026-03-24T17:19:43.209 INFO:tasks.workunit.client.0.vm01.stderr:+ grep test2
2026-03-24T17:19:43.230 INFO:tasks.workunit.client.0.vm01.stdout:test2
2026-03-24T17:19:43.230 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls
2026-03-24T17:19:43.230 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:19:43.230 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 2
2026-03-24T17:19:43.253 INFO:tasks.workunit.client.0.vm01.stdout:2
2026-03-24T17:19:43.253 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls -l
2026-03-24T17:19:43.253 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'test1.*2.*'
2026-03-24T17:19:43.284 INFO:tasks.workunit.client.0.vm01.stdout:test1 1 MiB 2
2026-03-24T17:19:43.284 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls -l
2026-03-24T17:19:43.284 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'test2.*2.*'
2026-03-24T17:19:43.316 INFO:tasks.workunit.client.0.vm01.stdout:test2 1 MiB 2
2026-03-24T17:19:43.316 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv test1
2026-03-24T17:19:43.360 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls
2026-03-24T17:19:43.360 INFO:tasks.workunit.client.0.vm01.stderr:+ grep test2
2026-03-24T17:19:43.381 INFO:tasks.workunit.client.0.vm01.stdout:test2
2026-03-24T17:19:43.382 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls
2026-03-24T17:19:43.382 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:19:43.382 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 1
2026-03-24T17:19:43.403 INFO:tasks.workunit.client.0.vm01.stdout:1
2026-03-24T17:19:43.403 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls -l
2026-03-24T17:19:43.404 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'test2.*2.*'
2026-03-24T17:19:43.430 INFO:tasks.workunit.client.0.vm01.stdout:test2 1 MiB 2
2026-03-24T17:19:43.430 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:19:43.430 INFO:tasks.workunit.client.0.vm01.stderr:+ grep test1
2026-03-24T17:19:43.454 INFO:tasks.workunit.client.0.vm01.stdout:22404ca21913 test1
2026-03-24T17:19:43.454 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:19:43.454 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:19:43.454 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 1
2026-03-24T17:19:43.477 INFO:tasks.workunit.client.0.vm01.stdout:1
2026-03-24T17:19:43.477 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls -l
2026-03-24T17:19:43.477 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'test1.*USER.*'
2026-03-24T17:19:43.503 INFO:tasks.workunit.client.0.vm01.stdout:22404ca21913 test1 USER Tue Mar 24 17:19:43 2026 expired at Tue Mar 24 17:19:43 2026
2026-03-24T17:19:43.504 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls -l
2026-03-24T17:19:43.504 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -v 'protected until'
2026-03-24T17:19:43.530 INFO:tasks.workunit.client.0.vm01.stdout:ID NAME SOURCE DELETED_AT STATUS PARENT
2026-03-24T17:19:43.530 INFO:tasks.workunit.client.0.vm01.stdout:22404ca21913 test1 USER Tue Mar 24 17:19:43 2026 expired at Tue Mar 24 17:19:43 2026
2026-03-24T17:19:43.530 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd trash ls
2026-03-24T17:19:43.530 INFO:tasks.workunit.client.0.vm01.stderr:++ cut -d ' ' -f 1
2026-03-24T17:19:43.553 INFO:tasks.workunit.client.0.vm01.stderr:+ ID=22404ca21913
2026-03-24T17:19:43.553 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash rm 22404ca21913
2026-03-24T17:19:43.591 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
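The trash test above recovers the image ID with `rbd trash ls | cut -d ' ' -f 1` (yielding ID=22404ca21913 here). A minimal sketch of that extraction on the trash listing captured in this log; the helper name is illustrative, not from the workunit:

```python
# Mimic `rbd trash ls | cut -d ' ' -f 1` on the trash listing from this run.
TRASH_LS = "22404ca21913 test1"

def trash_image_id(trash_ls_line: str) -> str:
    """`cut -d ' ' -f 1`: everything before the first space delimiter."""
    return trash_ls_line.split(" ", 1)[0]

# The workunit then runs: rbd trash rm 22404ca21913
print(trash_image_id(TRASH_LS))  # → 22404ca21913
```

This works because the trash ID is always the first space-delimited column of `rbd trash ls` output, regardless of the image name that follows.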
2026-03-24T17:19:43.595 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv test2 2026-03-24T17:19:43.638 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd trash ls 2026-03-24T17:19:43.638 INFO:tasks.workunit.client.0.vm01.stderr:++ cut -d ' ' -f 1 2026-03-24T17:19:43.661 INFO:tasks.workunit.client.0.vm01.stderr:+ ID=2243b808fd91 2026-03-24T17:19:43.661 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info --image-id 2243b808fd91 2026-03-24T17:19:43.661 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'rbd image '\''test2'\''' 2026-03-24T17:19:43.686 INFO:tasks.workunit.client.0.vm01.stdout:rbd image 'test2': 2026-03-24T17:19:43.687 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd children --image-id 2243b808fd91 2026-03-24T17:19:43.687 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:19:43.687 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 0 2026-03-24T17:19:43.712 INFO:tasks.workunit.client.0.vm01.stdout:0 2026-03-24T17:19:43.713 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash restore 2243b808fd91 2026-03-24T17:19:43.743 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls 2026-03-24T17:19:43.743 INFO:tasks.workunit.client.0.vm01.stderr:+ grep test2 2026-03-24T17:19:43.765 INFO:tasks.workunit.client.0.vm01.stdout:test2 2026-03-24T17:19:43.765 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls 2026-03-24T17:19:43.765 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:19:43.765 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 1 2026-03-24T17:19:43.787 INFO:tasks.workunit.client.0.vm01.stdout:1 2026-03-24T17:19:43.787 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls -l 2026-03-24T17:19:43.788 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'test2.*2.*' 2026-03-24T17:19:43.815 INFO:tasks.workunit.client.0.vm01.stdout:test2 1 MiB 2 2026-03-24T17:19:43.815 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv test2 --expires-at '3600 sec' 2026-03-24T17:19:43.855 INFO:tasks.workunit.client.0.vm01.stdout:rbd: image test2 will expire at 
2026-03-24T18:19:43.835731+0000 2026-03-24T17:19:43.859 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls 2026-03-24T17:19:43.859 INFO:tasks.workunit.client.0.vm01.stderr:+ grep test2 2026-03-24T17:19:43.881 INFO:tasks.workunit.client.0.vm01.stdout:2243b808fd91 test2 2026-03-24T17:19:43.881 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls 2026-03-24T17:19:43.881 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:19:43.881 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 1 2026-03-24T17:19:43.903 INFO:tasks.workunit.client.0.vm01.stdout:1 2026-03-24T17:19:43.903 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls -l 2026-03-24T17:19:43.903 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'test2.*USER.*protected until' 2026-03-24T17:19:43.929 INFO:tasks.workunit.client.0.vm01.stdout:2243b808fd91 test2 USER Tue Mar 24 17:19:43 2026 protected until Tue Mar 24 18:19:43 2026 2026-03-24T17:19:43.929 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash rm 2243b808fd91 2026-03-24T17:19:43.929 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'Deferment time has not expired' 2026-03-24T17:19:43.950 INFO:tasks.workunit.client.0.vm01.stdout:Deferment time has not expired, please use --force if you really want to remove the image 2026-03-24T17:19:43.951 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash rm --image-id 2243b808fd91 --force 2026-03-24T17:19:43.991 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T17:19:43.994 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 -s 1 test1 2026-03-24T17:19:44.027 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create test1@snap1 2026-03-24T17:19:44.551 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 
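The trace above is the basic `rbd trash` lifecycle that `cli_generic.sh` verifies: move by name, address trash entries by ID, restore, and deferred deletion. A condensed sketch of that flow (image names are illustrative; assumes a running cluster with the `rbd` CLI on the default pool):

```sh
# Move an image to the trash; it disappears from `rbd ls`
rbd trash mv test1
rbd trash ls                        # "<id> test1"

# Trash entries are removed by ID, not by name
ID=$(rbd trash ls | awk '{ print $1 }')
rbd trash rm "$ID"

# A trashed image can be restored by ID
rbd trash mv test2
ID=$(rbd trash ls | awk '{ print $1 }')
rbd trash restore "$ID"

# With a deferment period, plain `trash rm` is refused until expiry
rbd trash mv test2 --expires-at '3600 sec'
ID=$(rbd trash ls | awk '{ print $1 }')
rbd trash rm "$ID" || true          # "Deferment time has not expired..."
rbd trash rm --image-id "$ID" --force   # --force overrides the deferment
```

As the log shows, `trash ls -l` reports an immediate move as "expired at <now>" and a deferred move as "protected until <expiry>".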
2026-03-24T17:19:44.558 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap protect test1@snap1 2026-03-24T17:19:44.591 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone test1@snap1 clone 2026-03-24T17:19:44.639 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv test1 2026-03-24T17:19:44.682 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls 2026-03-24T17:19:44.682 INFO:tasks.workunit.client.0.vm01.stderr:+ grep test1 2026-03-24T17:19:44.703 INFO:tasks.workunit.client.0.vm01.stdout:229c76094c84 test1 2026-03-24T17:19:44.703 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls 2026-03-24T17:19:44.703 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:19:44.703 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 1 2026-03-24T17:19:44.724 INFO:tasks.workunit.client.0.vm01.stdout:1 2026-03-24T17:19:44.724 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls -l 2026-03-24T17:19:44.724 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'test1.*USER.*' 2026-03-24T17:19:44.750 INFO:tasks.workunit.client.0.vm01.stdout:229c76094c84 test1 USER Tue Mar 24 17:19:44 2026 expired at Tue Mar 24 17:19:44 2026 2026-03-24T17:19:44.750 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls -l 2026-03-24T17:19:44.750 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -v 'protected until' 2026-03-24T17:19:44.776 INFO:tasks.workunit.client.0.vm01.stdout:ID NAME SOURCE DELETED_AT STATUS PARENT 2026-03-24T17:19:44.776 INFO:tasks.workunit.client.0.vm01.stdout:229c76094c84 test1 USER Tue Mar 24 17:19:44 2026 expired at Tue Mar 24 17:19:44 2026 2026-03-24T17:19:44.776 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd trash ls 2026-03-24T17:19:44.776 INFO:tasks.workunit.client.0.vm01.stderr:++ cut -d ' ' -f 1 2026-03-24T17:19:44.800 INFO:tasks.workunit.client.0.vm01.stderr:+ ID=229c76094c84 2026-03-24T17:19:44.800 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap ls --image-id 229c76094c84 2026-03-24T17:19:44.800 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -v SNAPID 
2026-03-24T17:19:44.800 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:19:44.800 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 1 2026-03-24T17:19:44.826 INFO:tasks.workunit.client.0.vm01.stdout:1 2026-03-24T17:19:44.826 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap ls --image-id 229c76094c84 2026-03-24T17:19:44.826 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '.*snap1.*' 2026-03-24T17:19:44.854 INFO:tasks.workunit.client.0.vm01.stdout: 18 snap1 1 MiB yes Tue Mar 24 17:19:44 2026 2026-03-24T17:19:44.854 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd children --image-id 229c76094c84 2026-03-24T17:19:44.854 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:19:44.854 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 1 2026-03-24T17:19:44.884 INFO:tasks.workunit.client.0.vm01.stdout:1 2026-03-24T17:19:44.884 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd children --image-id 229c76094c84 2026-03-24T17:19:44.884 INFO:tasks.workunit.client.0.vm01.stderr:+ grep clone 2026-03-24T17:19:44.914 INFO:tasks.workunit.client.0.vm01.stdout:rbd/clone 2026-03-24T17:19:44.915 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm clone 2026-03-24T17:19:44.974 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:19:44.970+0000 7fca81237640 0 -- 192.168.123.101:0/4051104100 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55eb37133320 msgr2=0x55eb371688a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-24T17:19:44.978 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T17:19:44.981 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap unprotect --image-id 229c76094c84 --snap snap1 2026-03-24T17:19:45.014 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap rm --image-id 229c76094c84 --snap snap1 2026-03-24T17:19:45.552 INFO:tasks.workunit.client.0.vm01.stderr: Removing snap: 100% complete...done. 
2026-03-24T17:19:45.561 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap ls --image-id 229c76094c84 2026-03-24T17:19:45.561 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -v SNAPID 2026-03-24T17:19:45.561 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:19:45.561 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 0 2026-03-24T17:19:45.588 INFO:tasks.workunit.client.0.vm01.stdout:0 2026-03-24T17:19:45.588 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash restore 229c76094c84 2026-03-24T17:19:45.617 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create test1@snap1 2026-03-24T17:19:46.558 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-24T17:19:46.565 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create test1@snap2 2026-03-24T17:19:46.849 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-24T17:19:46.857 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap ls --image-id 229c76094c84 2026-03-24T17:19:46.857 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -v SNAPID 2026-03-24T17:19:46.857 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:19:46.857 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 2 2026-03-24T17:19:46.883 INFO:tasks.workunit.client.0.vm01.stdout:2 2026-03-24T17:19:46.883 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap purge --image-id 229c76094c84 2026-03-24T17:19:48.849 INFO:tasks.workunit.client.0.vm01.stderr: Removing all snapshots: 50% complete... Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done. 
2026-03-24T17:19:48.856 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap ls --image-id 229c76094c84 2026-03-24T17:19:48.856 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -v SNAPID 2026-03-24T17:19:48.856 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:19:48.856 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 0 2026-03-24T17:19:48.881 INFO:tasks.workunit.client.0.vm01.stdout:0 2026-03-24T17:19:48.881 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm --rbd_move_to_trash_on_remove=true --rbd_move_to_trash_on_remove_expire_seconds=3600 test1 2026-03-24T17:19:48.919 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T17:19:48.923 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls 2026-03-24T17:19:48.923 INFO:tasks.workunit.client.0.vm01.stderr:+ grep test1 2026-03-24T17:19:48.944 INFO:tasks.workunit.client.0.vm01.stdout:229c76094c84 test1 2026-03-24T17:19:48.944 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls 2026-03-24T17:19:48.944 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:19:48.944 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 1 2026-03-24T17:19:48.965 INFO:tasks.workunit.client.0.vm01.stdout:1 2026-03-24T17:19:48.965 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls -l 2026-03-24T17:19:48.965 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'test1.*USER.*protected until' 2026-03-24T17:19:48.991 INFO:tasks.workunit.client.0.vm01.stdout:229c76094c84 test1 USER Tue Mar 24 17:19:48 2026 protected until Tue Mar 24 18:19:48 2026 2026-03-24T17:19:48.991 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash rm 229c76094c84 2026-03-24T17:19:48.991 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'Deferment time has not expired' 2026-03-24T17:19:49.013 INFO:tasks.workunit.client.0.vm01.stdout:Deferment time has not expired, please use --force if you really want to remove the image 2026-03-24T17:19:49.014 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash rm --image-id 
229c76094c84 --force 2026-03-24T17:19:49.060 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T17:19:49.063 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images 2026-03-24T17:19:49.063 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:49.122 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:49.181 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:49.241 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:49.305 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:49.363 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:49.427 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:49.495 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:49.556 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:49.818 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:49.881 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:49.943 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:50.009 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:50.073 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:50.134 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:50.196 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:50.257 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:50.318 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:50.379 INFO:tasks.workunit.client.0.vm01.stderr:+ test_purge 2026-03-24T17:19:50.379 INFO:tasks.workunit.client.0.vm01.stdout:testing trash purge... 2026-03-24T17:19:50.379 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'testing trash purge...' 
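The segment ending here checks that a trashed image's snapshots and clones remain manageable through `--image-id`, and that `rbd rm` itself can route through the trash via config overrides. A reduced sketch under the same assumptions (live cluster, illustrative names):

```sh
# Trash an image that has a protected snapshot with a clone
rbd snap create test1@snap1
rbd snap protect test1@snap1
rbd clone test1@snap1 clone
rbd trash mv test1

# The trashed parent is still reachable by ID
ID=$(rbd trash ls | awk '{ print $1 }')
rbd snap ls   --image-id "$ID"
rbd children  --image-id "$ID"          # lists rbd/clone

# Clean up through the ID, then bring the image back
rbd rm clone
rbd snap unprotect --image-id "$ID" --snap snap1
rbd snap rm        --image-id "$ID" --snap snap1
rbd trash restore "$ID"

# `rbd rm` can defer to the trash instead of deleting outright
rbd rm --rbd_move_to_trash_on_remove=true \
       --rbd_move_to_trash_on_remove_expire_seconds=3600 test1
```

Per the log, the override leaves the image in the trash as "protected until" one hour out, so the test finishes it off with `trash rm --image-id <id> --force`.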
2026-03-24T17:19:50.379 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images 2026-03-24T17:19:50.379 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:50.440 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:50.501 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:50.560 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:50.622 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:50.683 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:50.745 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:50.809 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:50.873 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:50.934 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:50.998 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:51.061 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:51.124 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:51.186 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:51.248 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:51.308 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:51.369 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:51.431 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:19:51.496 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls 2026-03-24T17:19:51.496 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:19:51.496 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 0 2026-03-24T17:19:51.517 INFO:tasks.workunit.client.0.vm01.stdout:0 2026-03-24T17:19:51.518 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge 2026-03-24T17:19:51.536 
INFO:tasks.workunit.client.0.vm01.stderr: Removing images: 100% complete...done. 2026-03-24T17:19:51.539 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 256 testimg1 2026-03-24T17:19:51.570 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 256 testimg2 2026-03-24T17:19:51.604 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv testimg1 2026-03-24T17:19:51.649 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv testimg2 2026-03-24T17:19:51.690 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls 2026-03-24T17:19:51.690 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:19:51.690 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 2 2026-03-24T17:19:51.712 INFO:tasks.workunit.client.0.vm01.stdout:2 2026-03-24T17:19:51.712 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge 2026-03-24T17:19:51.774 INFO:tasks.workunit.client.0.vm01.stderr: Removing images: 50% complete... Removing images: 100% complete... Removing images: 100% complete...done. 
2026-03-24T17:19:51.777 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls 2026-03-24T17:19:51.777 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:19:51.777 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 0 2026-03-24T17:19:51.799 INFO:tasks.workunit.client.0.vm01.stdout:0 2026-03-24T17:19:51.799 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 256 testimg1 2026-03-24T17:19:51.832 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 256 testimg2 2026-03-24T17:19:51.866 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv testimg1 --expires-at '1 hour' 2026-03-24T17:19:51.907 INFO:tasks.workunit.client.0.vm01.stdout:rbd: image testimg1 will expire at 2026-03-24T18:19:51.887480+0000 2026-03-24T17:19:51.911 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv testimg2 --expires-at '3 hours' 2026-03-24T17:19:51.952 INFO:tasks.workunit.client.0.vm01.stdout:rbd: image testimg2 will expire at 2026-03-24T20:19:51.932886+0000 2026-03-24T17:19:51.955 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls 2026-03-24T17:19:51.955 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:19:51.955 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 2 2026-03-24T17:19:51.977 INFO:tasks.workunit.client.0.vm01.stdout:2 2026-03-24T17:19:51.977 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge 2026-03-24T17:19:51.996 INFO:tasks.workunit.client.0.vm01.stderr: Removing images: 100% complete...done. 
2026-03-24T17:19:51.998 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls 2026-03-24T17:19:51.998 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:19:51.998 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 2 2026-03-24T17:19:52.020 INFO:tasks.workunit.client.0.vm01.stdout:2 2026-03-24T17:19:52.020 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge --expired-before 'now + 2 hours' 2026-03-24T17:19:52.068 INFO:tasks.workunit.client.0.vm01.stderr: Removing images: 100% complete... Removing images: 100% complete...done. 2026-03-24T17:19:52.072 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls 2026-03-24T17:19:52.072 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:19:52.072 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 1 2026-03-24T17:19:52.094 INFO:tasks.workunit.client.0.vm01.stdout:1 2026-03-24T17:19:52.095 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls 2026-03-24T17:19:52.095 INFO:tasks.workunit.client.0.vm01.stderr:+ grep testimg2 2026-03-24T17:19:52.116 INFO:tasks.workunit.client.0.vm01.stdout:23ea2ddf3fc4 testimg2 2026-03-24T17:19:52.116 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge --expired-before 'now + 4 hours' 2026-03-24T17:19:52.156 INFO:tasks.workunit.client.0.vm01.stderr: Removing images: 100% complete... Removing images: 100% complete...done. 2026-03-24T17:19:52.159 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls 2026-03-24T17:19:52.159 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:19:52.159 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 0 2026-03-24T17:19:52.179 INFO:tasks.workunit.client.0.vm01.stdout:0 2026-03-24T17:19:52.180 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 256 testimg1 2026-03-24T17:19:52.212 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create testimg1@snap 2026-03-24T17:19:53.065 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 
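The `test_purge` runs above exercise expiry-aware purging: a plain `rbd trash purge` skips entries whose deferment has not expired, while `--expired-before` shifts the cutoff. Sketched (times and names as in the log):

```sh
rbd trash mv testimg1 --expires-at '1 hour'
rbd trash mv testimg2 --expires-at '3 hours'

rbd trash purge                                   # removes nothing: neither expired
rbd trash purge --expired-before 'now + 2 hours'  # removes only testimg1
rbd trash purge --expired-before 'now + 4 hours'  # now testimg2 goes too
rbd trash ls | wc -l                              # 0
```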
2026-03-24T17:19:53.072 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 256 testimg2 2026-03-24T17:19:53.103 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 256 testimg3 2026-03-24T17:19:53.137 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv testimg1 2026-03-24T17:19:53.179 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv testimg2 2026-03-24T17:19:53.219 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv testimg3 2026-03-24T17:19:53.265 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls 2026-03-24T17:19:53.265 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:19:53.265 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 3 2026-03-24T17:19:53.287 INFO:tasks.workunit.client.0.vm01.stdout:3 2026-03-24T17:19:53.287 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge 2026-03-24T17:19:53.287 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'some expired images could not be removed' 2026-03-24T17:19:53.374 INFO:tasks.workunit.client.0.vm01.stdout:rbd: some expired images could not be removed 2026-03-24T17:19:53.374 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls 2026-03-24T17:19:53.374 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:19:53.374 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 1 2026-03-24T17:19:53.397 INFO:tasks.workunit.client.0.vm01.stdout:1 2026-03-24T17:19:53.397 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls 2026-03-24T17:19:53.397 INFO:tasks.workunit.client.0.vm01.stderr:+ grep testimg1 2026-03-24T17:19:53.419 INFO:tasks.workunit.client.0.vm01.stdout:240affabc420 testimg1 2026-03-24T17:19:53.419 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd trash ls 2026-03-24T17:19:53.419 INFO:tasks.workunit.client.0.vm01.stderr:++ awk '{ print $1 }' 2026-03-24T17:19:53.441 INFO:tasks.workunit.client.0.vm01.stderr:+ ID=240affabc420 2026-03-24T17:19:53.441 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap purge --image-id 240affabc420 
2026-03-24T17:19:54.069 INFO:tasks.workunit.client.0.vm01.stderr: Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done. 2026-03-24T17:19:54.077 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge 2026-03-24T17:19:54.118 INFO:tasks.workunit.client.0.vm01.stderr: Removing images: 100% complete... Removing images: 100% complete...done. 2026-03-24T17:19:54.121 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls 2026-03-24T17:19:54.121 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:19:54.121 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 0 2026-03-24T17:19:54.143 INFO:tasks.workunit.client.0.vm01.stdout:0 2026-03-24T17:19:54.144 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 256 testimg1 2026-03-24T17:19:54.175 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 256 testimg2 2026-03-24T17:19:54.205 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create testimg2@snap 2026-03-24T17:19:55.074 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 
2026-03-24T17:19:55.080 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 256 testimg3 2026-03-24T17:19:55.115 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv testimg1 2026-03-24T17:19:55.157 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv testimg2 2026-03-24T17:19:55.198 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv testimg3 2026-03-24T17:19:55.243 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls 2026-03-24T17:19:55.244 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:19:55.244 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 3 2026-03-24T17:19:55.265 INFO:tasks.workunit.client.0.vm01.stdout:3 2026-03-24T17:19:55.265 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge 2026-03-24T17:19:55.265 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'some expired images could not be removed' 2026-03-24T17:19:55.363 INFO:tasks.workunit.client.0.vm01.stdout:rbd: some expired images could not be removed 2026-03-24T17:19:55.363 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls 2026-03-24T17:19:55.363 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:19:55.363 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 1 2026-03-24T17:19:55.385 INFO:tasks.workunit.client.0.vm01.stdout:1 2026-03-24T17:19:55.386 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls 2026-03-24T17:19:55.386 INFO:tasks.workunit.client.0.vm01.stderr:+ grep testimg2 2026-03-24T17:19:55.409 INFO:tasks.workunit.client.0.vm01.stdout:243a34f869b testimg2 2026-03-24T17:19:55.410 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd trash ls 2026-03-24T17:19:55.410 INFO:tasks.workunit.client.0.vm01.stderr:++ awk '{ print $1 }' 2026-03-24T17:19:55.434 INFO:tasks.workunit.client.0.vm01.stderr:+ ID=243a34f869b 2026-03-24T17:19:55.434 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap purge --image-id 243a34f869b 2026-03-24T17:19:56.126 INFO:tasks.workunit.client.0.vm01.stderr: Removing all snapshots: 100% complete... 
Removing all snapshots: 100% complete...done. 2026-03-24T17:19:56.133 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge 2026-03-24T17:19:56.175 INFO:tasks.workunit.client.0.vm01.stderr: Removing images: 100% complete... Removing images: 100% complete...done. 2026-03-24T17:19:56.180 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls 2026-03-24T17:19:56.180 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:19:56.180 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 0 2026-03-24T17:19:56.404 INFO:tasks.workunit.client.0.vm01.stdout:0 2026-03-24T17:19:56.404 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 256 testimg1 2026-03-24T17:19:56.440 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 256 testimg2 2026-03-24T17:19:56.472 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 256 testimg3 2026-03-24T17:19:56.508 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create testimg3@snap 2026-03-24T17:19:56.853 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 
2026-03-24T17:19:56.860 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv testimg1 2026-03-24T17:19:56.903 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv testimg2 2026-03-24T17:19:56.943 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv testimg3 2026-03-24T17:19:56.987 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls 2026-03-24T17:19:56.987 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:19:56.987 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 3 2026-03-24T17:19:57.009 INFO:tasks.workunit.client.0.vm01.stdout:3 2026-03-24T17:19:57.010 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge 2026-03-24T17:19:57.010 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'some expired images could not be removed' 2026-03-24T17:19:57.099 INFO:tasks.workunit.client.0.vm01.stdout:rbd: some expired images could not be removed 2026-03-24T17:19:57.099 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls 2026-03-24T17:19:57.099 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:19:57.099 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 1 2026-03-24T17:19:57.123 INFO:tasks.workunit.client.0.vm01.stdout:1 2026-03-24T17:19:57.124 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls 2026-03-24T17:19:57.124 INFO:tasks.workunit.client.0.vm01.stderr:+ grep testimg3 2026-03-24T17:19:57.147 INFO:tasks.workunit.client.0.vm01.stdout:246a7967813f testimg3 2026-03-24T17:19:57.147 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd trash ls 2026-03-24T17:19:57.147 INFO:tasks.workunit.client.0.vm01.stderr:++ awk '{ print $1 }' 2026-03-24T17:19:57.172 INFO:tasks.workunit.client.0.vm01.stderr:+ ID=246a7967813f 2026-03-24T17:19:57.172 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap purge --image-id 246a7967813f 2026-03-24T17:19:58.132 INFO:tasks.workunit.client.0.vm01.stderr: Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done. 
2026-03-24T17:19:58.139 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge 2026-03-24T17:19:58.182 INFO:tasks.workunit.client.0.vm01.stderr: Removing images: 100% complete... Removing images: 100% complete...done. 2026-03-24T17:19:58.186 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls 2026-03-24T17:19:58.186 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:19:58.186 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 0 2026-03-24T17:19:58.209 INFO:tasks.workunit.client.0.vm01.stdout:0 2026-03-24T17:19:58.209 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 256 testimg1 2026-03-24T17:19:58.245 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create testimg1@snap 2026-03-24T17:19:59.139 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-24T17:19:59.146 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone --rbd-default-clone-format=2 testimg1@snap testimg2 2026-03-24T17:19:59.191 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap rm testimg1@snap 2026-03-24T17:19:59.223 INFO:tasks.workunit.client.0.vm01.stderr: Removing snap: 100% complete...done. 2026-03-24T17:19:59.229 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 256 testimg3 2026-03-24T17:19:59.263 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create testimg2@snap 2026-03-24T17:20:00.143 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 
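The three repetitions above (the snapshotted image first, middle, and last in the trash) confirm that `rbd trash purge` refuses to delete a trashed image that still has snapshots, regardless of its position in the listing. The pattern, sketched:

```sh
rbd snap create testimg1@snap
rbd trash mv testimg1
rbd trash mv testimg2
rbd trash mv testimg3

rbd trash purge | grep 'some expired images could not be removed'
rbd trash ls                        # only testimg1 (still has a snapshot) remains

# Drop the snapshots via the trash ID, then the purge succeeds
ID=$(rbd trash ls | awk '{ print $1 }')
rbd snap purge --image-id "$ID"
rbd trash purge
```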
2026-03-24T17:20:00.149 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone --rbd-default-clone-format=2 testimg2@snap testimg4 2026-03-24T17:20:00.368 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:20:00.362+0000 7fcc75857640 0 --2- 192.168.123.101:0/3952069384 >> [v2:192.168.123.101:3300/0,v1:192.168.123.101:6789/0] conn(0x56242e645690 0x56242e706b10 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 2026-03-24T17:20:00.398 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone --rbd-default-clone-format=2 testimg2@snap testimg5 2026-03-24T17:20:00.445 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap rm testimg2@snap 2026-03-24T17:20:00.662 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:20:00.658+0000 7f7403fff640 0 --2- 192.168.123.101:0/1276415198 >> v2:192.168.123.101:3300/0 conn(0x561ccc457ab0 0x561ccc498e30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 2026-03-24T17:20:00.679 INFO:tasks.workunit.client.0.vm01.stderr: Removing snap: 100% complete...done. 2026-03-24T17:20:00.685 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create testimg4@snap 2026-03-24T17:20:01.146 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-24T17:20:01.154 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone --rbd-default-clone-format=2 testimg4@snap testimg6 2026-03-24T17:20:01.211 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap rm testimg4@snap 2026-03-24T17:20:01.249 INFO:tasks.workunit.client.0.vm01.stderr: Removing snap: 100% complete...done. 
2026-03-24T17:20:01.257 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv testimg1 2026-03-24T17:20:01.301 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv testimg2 2026-03-24T17:20:01.345 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv testimg3 2026-03-24T17:20:01.392 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv testimg4 2026-03-24T17:20:01.437 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls 2026-03-24T17:20:01.437 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:20:01.437 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 4 2026-03-24T17:20:01.459 INFO:tasks.workunit.client.0.vm01.stdout:4 2026-03-24T17:20:01.459 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge 2026-03-24T17:20:01.459 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'some expired images could not be removed' 2026-03-24T17:20:01.586 INFO:tasks.workunit.client.0.vm01.stdout:rbd: some expired images could not be removed 2026-03-24T17:20:01.586 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls 2026-03-24T17:20:01.586 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:20:01.586 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 3 2026-03-24T17:20:01.607 INFO:tasks.workunit.client.0.vm01.stdout:3 2026-03-24T17:20:01.608 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls 2026-03-24T17:20:01.608 INFO:tasks.workunit.client.0.vm01.stderr:+ grep testimg1 2026-03-24T17:20:01.628 INFO:tasks.workunit.client.0.vm01.stdout:24902d7a525e testimg1 2026-03-24T17:20:01.628 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls 2026-03-24T17:20:01.628 INFO:tasks.workunit.client.0.vm01.stderr:+ grep testimg2 2026-03-24T17:20:01.649 INFO:tasks.workunit.client.0.vm01.stdout:24951371288b testimg2 2026-03-24T17:20:01.649 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls 2026-03-24T17:20:01.649 INFO:tasks.workunit.client.0.vm01.stderr:+ grep testimg4 2026-03-24T17:20:01.670 INFO:tasks.workunit.client.0.vm01.stdout:24a180d6a8aa testimg4 
2026-03-24T17:20:01.670 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv testimg6
2026-03-24T17:20:01.710 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:01.710 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:20:01.710 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 4
2026-03-24T17:20:01.731 INFO:tasks.workunit.client.0.vm01.stdout:4
2026-03-24T17:20:01.731 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge
2026-03-24T17:20:01.731 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'some expired images could not be removed'
2026-03-24T17:20:01.929 INFO:tasks.workunit.client.0.vm01.stdout:rbd: some expired images could not be removed
2026-03-24T17:20:01.929 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:01.929 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:20:01.929 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 2
2026-03-24T17:20:01.950 INFO:tasks.workunit.client.0.vm01.stdout:2
2026-03-24T17:20:01.951 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:01.951 INFO:tasks.workunit.client.0.vm01.stderr:+ grep testimg1
2026-03-24T17:20:01.971 INFO:tasks.workunit.client.0.vm01.stdout:24902d7a525e testimg1
2026-03-24T17:20:01.971 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:01.971 INFO:tasks.workunit.client.0.vm01.stderr:+ grep testimg2
2026-03-24T17:20:01.992 INFO:tasks.workunit.client.0.vm01.stdout:24951371288b testimg2
2026-03-24T17:20:01.992 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv testimg5
2026-03-24T17:20:02.037 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:02.037 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:20:02.037 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 3
2026-03-24T17:20:02.058 INFO:tasks.workunit.client.0.vm01.stdout:3
2026-03-24T17:20:02.059 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge
2026-03-24T17:20:02.117 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:20:02.110+0000 7f1701952640 0 -- 192.168.123.101:0/1339523907 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f16e40573c0 msgr2=0x7f16e4057830 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:20:02.123 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:20:02.118+0000 7f17033dc640 0 -- 192.168.123.101:0/1339523907 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x561fc67beef0 msgr2=0x561fc67f0240 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:20:03.883 INFO:tasks.workunit.client.0.vm01.stderr: Removing images: 33% complete... Removing images: 66% complete... Removing images: 100% complete... Removing images: 100% complete...done.
2026-03-24T17:20:03.887 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:03.887 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:20:03.887 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 0
2026-03-24T17:20:03.908 INFO:tasks.workunit.client.0.vm01.stdout:0
2026-03-24T17:20:03.908 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 256 testimg1
2026-03-24T17:20:03.944 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create testimg1@snap
2026-03-24T17:20:04.859 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T17:20:04.866 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone --rbd-default-clone-format=2 testimg1@snap testimg2
2026-03-24T17:20:04.913 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap rm testimg1@snap
2026-03-24T17:20:04.943 INFO:tasks.workunit.client.0.vm01.stderr: Removing snap: 100% complete...done.
2026-03-24T17:20:04.950 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 256 testimg3
2026-03-24T17:20:04.986 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create testimg3@snap
2026-03-24T17:20:05.869 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T17:20:05.876 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create testimg2@snap
2026-03-24T17:20:06.853 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T17:20:06.860 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone --rbd-default-clone-format=2 testimg2@snap testimg4
2026-03-24T17:20:06.909 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone --rbd-default-clone-format=2 testimg2@snap testimg5
2026-03-24T17:20:06.957 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap rm testimg2@snap
2026-03-24T17:20:06.989 INFO:tasks.workunit.client.0.vm01.stderr: Removing snap: 100% complete...done.
2026-03-24T17:20:06.995 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create testimg4@snap
2026-03-24T17:20:07.875 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T17:20:07.882 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone --rbd-default-clone-format=2 testimg4@snap testimg6
2026-03-24T17:20:07.930 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap rm testimg4@snap
2026-03-24T17:20:07.965 INFO:tasks.workunit.client.0.vm01.stderr: Removing snap: 100% complete...done.
2026-03-24T17:20:07.971 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv testimg1
2026-03-24T17:20:08.013 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv testimg2
2026-03-24T17:20:08.056 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv testimg3
2026-03-24T17:20:08.099 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv testimg4
2026-03-24T17:20:08.141 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:08.141 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:20:08.141 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 4
2026-03-24T17:20:08.162 INFO:tasks.workunit.client.0.vm01.stdout:4
2026-03-24T17:20:08.162 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge
2026-03-24T17:20:08.162 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'some expired images could not be removed'
2026-03-24T17:20:08.240 INFO:tasks.workunit.client.0.vm01.stdout:rbd: some expired images could not be removed
2026-03-24T17:20:08.240 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:08.240 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:20:08.240 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 4
2026-03-24T17:20:08.261 INFO:tasks.workunit.client.0.vm01.stdout:4
2026-03-24T17:20:08.261 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv testimg6
2026-03-24T17:20:08.303 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:08.303 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:20:08.303 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 5
2026-03-24T17:20:08.325 INFO:tasks.workunit.client.0.vm01.stdout:5
2026-03-24T17:20:08.325 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge
2026-03-24T17:20:08.325 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'some expired images could not be removed'
2026-03-24T17:20:08.992 INFO:tasks.workunit.client.0.vm01.stdout:rbd: some expired images could not be removed
2026-03-24T17:20:08.992 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:08.992 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:20:08.992 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 3
2026-03-24T17:20:09.013 INFO:tasks.workunit.client.0.vm01.stdout:3
2026-03-24T17:20:09.013 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:09.014 INFO:tasks.workunit.client.0.vm01.stderr:+ grep testimg1
2026-03-24T17:20:09.034 INFO:tasks.workunit.client.0.vm01.stdout:24ec3ff751b6 testimg1
2026-03-24T17:20:09.034 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:09.035 INFO:tasks.workunit.client.0.vm01.stderr:+ grep testimg2
2026-03-24T17:20:09.058 INFO:tasks.workunit.client.0.vm01.stdout:24f2ef8bb5e testimg2
2026-03-24T17:20:09.059 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:09.059 INFO:tasks.workunit.client.0.vm01.stderr:+ grep testimg3
2026-03-24T17:20:09.085 INFO:tasks.workunit.client.0.vm01.stdout:24f8c54e5731 testimg3
2026-03-24T17:20:09.085 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv testimg5
2026-03-24T17:20:09.130 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:09.130 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:20:09.130 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 4
2026-03-24T17:20:09.150 INFO:tasks.workunit.client.0.vm01.stdout:4
2026-03-24T17:20:09.151 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge
2026-03-24T17:20:09.151 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'some expired images could not be removed'
2026-03-24T17:20:11.176 INFO:tasks.workunit.client.0.vm01.stdout:rbd: some expired images could not be removed
2026-03-24T17:20:11.176 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:11.176 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:20:11.176 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 1
2026-03-24T17:20:11.200 INFO:tasks.workunit.client.0.vm01.stdout:1
2026-03-24T17:20:11.200 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:11.200 INFO:tasks.workunit.client.0.vm01.stderr:+ grep testimg3
2026-03-24T17:20:11.224 INFO:tasks.workunit.client.0.vm01.stdout:24f8c54e5731 testimg3
2026-03-24T17:20:11.224 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd trash ls
2026-03-24T17:20:11.224 INFO:tasks.workunit.client.0.vm01.stderr:++ awk '{ print $1 }'
2026-03-24T17:20:11.247 INFO:tasks.workunit.client.0.vm01.stderr:+ ID=24f8c54e5731
2026-03-24T17:20:11.247 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap purge --image-id 24f8c54e5731
2026-03-24T17:20:11.850 INFO:tasks.workunit.client.0.vm01.stderr: Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done.
2026-03-24T17:20:11.858 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge
2026-03-24T17:20:11.900 INFO:tasks.workunit.client.0.vm01.stderr: Removing images: 100% complete... Removing images: 100% complete...done.
2026-03-24T17:20:11.903 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:11.903 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:20:11.903 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 0
2026-03-24T17:20:11.925 INFO:tasks.workunit.client.0.vm01.stdout:0
2026-03-24T17:20:11.925 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 256 testimg1
2026-03-24T17:20:11.959 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create testimg1@snap
2026-03-24T17:20:12.857 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T17:20:12.864 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone --rbd-default-clone-format=2 testimg1@snap testimg2
2026-03-24T17:20:12.908 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap rm testimg1@snap
2026-03-24T17:20:12.939 INFO:tasks.workunit.client.0.vm01.stderr: Removing snap: 100% complete...done.
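[editor's note] The trace above exercises `rbd trash purge` against images whose clone-format-2 children keep them busy: the purge reports "some expired images could not be removed" and leaves the parents in the trash until their clones are gone. A minimal sketch of the same behavior, assuming a running cluster with a default pool (image names are illustrative, not from the test harness):

```shell
# Sketch of the pattern the workunit exercises (assumes a live Ceph cluster).
# A format-2 (clone v2) clone keeps its parent image busy even after the
# parent's snapshot is deleted, so a trashed parent survives "rbd trash purge".
rbd create --image-format 2 --size 256 parent
rbd snap create parent@snap
rbd clone --rbd-default-clone-format=2 parent@snap child
rbd snap rm parent@snap   # allowed with clone v2: no protected snapshot needed
rbd trash mv parent
rbd trash purge || true   # prints: rbd: some expired images could not be removed
rbd trash ls              # parent is still listed
rbd rm child              # remove the clone that pins the parent...
rbd trash purge           # ...and now the parent purges cleanly
```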
2026-03-24T17:20:12.946 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 256 testimg3
2026-03-24T17:20:12.979 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create testimg2@snap
2026-03-24T17:20:13.860 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T17:20:13.868 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone --rbd-default-clone-format=2 testimg2@snap testimg4
2026-03-24T17:20:13.919 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone --rbd-default-clone-format=2 testimg2@snap testimg5
2026-03-24T17:20:13.967 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap rm testimg2@snap
2026-03-24T17:20:14.001 INFO:tasks.workunit.client.0.vm01.stderr: Removing snap: 100% complete...done.
2026-03-24T17:20:14.008 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create testimg4@snap
2026-03-24T17:20:14.862 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T17:20:14.869 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone --rbd-default-clone-format=2 testimg4@snap testimg6
2026-03-24T17:20:14.923 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap rm testimg4@snap
2026-03-24T17:20:14.960 INFO:tasks.workunit.client.0.vm01.stderr: Removing snap: 100% complete...done.
2026-03-24T17:20:14.967 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm --rbd_move_parent_to_trash_on_remove=true testimg1
2026-03-24T17:20:15.017 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:20:15.021 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm --rbd_move_parent_to_trash_on_remove=true testimg2
2026-03-24T17:20:15.072 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:20:15.075 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv testimg3
2026-03-24T17:20:15.120 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm --rbd_move_parent_to_trash_on_remove=true testimg4
2026-03-24T17:20:15.170 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:20:15.173 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:15.173 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:20:15.173 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 4
2026-03-24T17:20:15.195 INFO:tasks.workunit.client.0.vm01.stdout:4
2026-03-24T17:20:15.195 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge
2026-03-24T17:20:15.195 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'some expired images could not be removed'
2026-03-24T17:20:15.318 INFO:tasks.workunit.client.0.vm01.stdout:rbd: some expired images could not be removed
2026-03-24T17:20:15.318 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:15.318 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:20:15.318 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 3
2026-03-24T17:20:15.342 INFO:tasks.workunit.client.0.vm01.stdout:3
2026-03-24T17:20:15.342 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:15.342 INFO:tasks.workunit.client.0.vm01.stderr:+ grep testimg1
2026-03-24T17:20:15.363 INFO:tasks.workunit.client.0.vm01.stdout:25568556e724 testimg1
2026-03-24T17:20:15.363 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:15.363 INFO:tasks.workunit.client.0.vm01.stderr:+ grep testimg2
2026-03-24T17:20:15.385 INFO:tasks.workunit.client.0.vm01.stdout:255ccb20fffe testimg2
2026-03-24T17:20:15.385 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:15.385 INFO:tasks.workunit.client.0.vm01.stderr:+ grep testimg4
2026-03-24T17:20:15.406 INFO:tasks.workunit.client.0.vm01.stdout:2568f6ed1da1 testimg4
2026-03-24T17:20:15.406 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv testimg6
2026-03-24T17:20:15.451 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:15.451 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:20:15.451 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 4
2026-03-24T17:20:15.472 INFO:tasks.workunit.client.0.vm01.stdout:4
2026-03-24T17:20:15.472 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge
2026-03-24T17:20:15.472 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'some expired images could not be removed'
2026-03-24T17:20:15.919 INFO:tasks.workunit.client.0.vm01.stdout:rbd: some expired images could not be removed
2026-03-24T17:20:15.919 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:15.919 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:20:15.920 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 2
2026-03-24T17:20:15.941 INFO:tasks.workunit.client.0.vm01.stdout:2
2026-03-24T17:20:15.941 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:15.941 INFO:tasks.workunit.client.0.vm01.stderr:+ grep testimg1
2026-03-24T17:20:15.961 INFO:tasks.workunit.client.0.vm01.stdout:25568556e724 testimg1
2026-03-24T17:20:15.961 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:15.961 INFO:tasks.workunit.client.0.vm01.stderr:+ grep testimg2
2026-03-24T17:20:16.178 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:20:16.174+0000 7f991be08640 0 --2- 192.168.123.101:0/276855189 >> [v2:192.168.123.101:3300/0,v1:192.168.123.101:6789/0] conn(0x55c73d7274f0 0x55c73d7226f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
2026-03-24T17:20:16.184 INFO:tasks.workunit.client.0.vm01.stdout:255ccb20fffe testimg2
2026-03-24T17:20:16.184 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv testimg5
2026-03-24T17:20:16.228 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:16.228 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:20:16.228 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 3
2026-03-24T17:20:16.250 INFO:tasks.workunit.client.0.vm01.stdout:3
2026-03-24T17:20:16.250 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge
2026-03-24T17:20:16.311 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:20:16.306+0000 7ff327d8d640 0 -- 192.168.123.101:0/2655669331 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7ff304046730 msgr2=0x7ff304066b10 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:20:16.316 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:20:16.310+0000 7ff32758c640 0 -- 192.168.123.101:0/2655669331 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x7ff300006bb0 msgr2=0x7ff300026f90 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:20:17.902 INFO:tasks.workunit.client.0.vm01.stderr: Removing images: 33% complete... Removing images: 66% complete... Removing images: 100% complete... Removing images: 100% complete...done.
2026-03-24T17:20:17.906 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:17.906 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:20:17.906 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 0
2026-03-24T17:20:17.928 INFO:tasks.workunit.client.0.vm01.stdout:0
2026-03-24T17:20:17.928 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 256 testimg1
2026-03-24T17:20:17.961 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create testimg1@snap
2026-03-24T17:20:18.876 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T17:20:18.883 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone --rbd-default-clone-format=2 testimg1@snap testimg2
2026-03-24T17:20:18.926 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap rm testimg1@snap
2026-03-24T17:20:18.955 INFO:tasks.workunit.client.0.vm01.stderr: Removing snap: 100% complete...done.
2026-03-24T17:20:18.962 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 256 testimg3
2026-03-24T17:20:18.998 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create testimg3@snap
2026-03-24T17:20:19.883 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T17:20:19.890 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create testimg2@snap
2026-03-24T17:20:20.882 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T17:20:20.889 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone --rbd-default-clone-format=2 testimg2@snap testimg4
2026-03-24T17:20:20.935 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone --rbd-default-clone-format=2 testimg2@snap testimg5
2026-03-24T17:20:20.980 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap rm testimg2@snap
2026-03-24T17:20:21.012 INFO:tasks.workunit.client.0.vm01.stderr: Removing snap: 100% complete...done.
2026-03-24T17:20:21.018 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create testimg4@snap
2026-03-24T17:20:21.858 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T17:20:21.865 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone --rbd-default-clone-format=2 testimg4@snap testimg6
2026-03-24T17:20:21.919 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap rm testimg4@snap
2026-03-24T17:20:21.957 INFO:tasks.workunit.client.0.vm01.stderr: Removing snap: 100% complete...done.
2026-03-24T17:20:21.963 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm --rbd_move_parent_to_trash_on_remove=true testimg1
2026-03-24T17:20:22.220 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:20:22.223 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm --rbd_move_parent_to_trash_on_remove=true testimg2
2026-03-24T17:20:22.274 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:20:22.277 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv testimg3
2026-03-24T17:20:22.321 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm --rbd_move_parent_to_trash_on_remove=true testimg4
2026-03-24T17:20:22.370 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:20:22.373 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:22.373 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:20:22.373 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 4
2026-03-24T17:20:22.394 INFO:tasks.workunit.client.0.vm01.stdout:4
2026-03-24T17:20:22.395 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge
2026-03-24T17:20:22.395 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'some expired images could not be removed'
2026-03-24T17:20:22.470 INFO:tasks.workunit.client.0.vm01.stdout:rbd: some expired images could not be removed
2026-03-24T17:20:22.470 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:22.470 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:20:22.470 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 4
2026-03-24T17:20:22.491 INFO:tasks.workunit.client.0.vm01.stdout:4
2026-03-24T17:20:22.491 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv testimg6
2026-03-24T17:20:22.535 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:22.535 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:20:22.535 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 5
2026-03-24T17:20:22.556 INFO:tasks.workunit.client.0.vm01.stdout:5
2026-03-24T17:20:22.556 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge
2026-03-24T17:20:22.556 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'some expired images could not be removed'
2026-03-24T17:20:22.958 INFO:tasks.workunit.client.0.vm01.stdout:rbd: some expired images could not be removed
2026-03-24T17:20:22.959 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:22.959 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:20:22.959 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 3
2026-03-24T17:20:22.981 INFO:tasks.workunit.client.0.vm01.stdout:3
2026-03-24T17:20:22.982 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:22.982 INFO:tasks.workunit.client.0.vm01.stderr:+ grep testimg1
2026-03-24T17:20:23.004 INFO:tasks.workunit.client.0.vm01.stdout:25b34f7cc0a testimg1
2026-03-24T17:20:23.004 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:23.004 INFO:tasks.workunit.client.0.vm01.stderr:+ grep testimg2
2026-03-24T17:20:23.027 INFO:tasks.workunit.client.0.vm01.stdout:25b97cbdb1d8 testimg2
2026-03-24T17:20:23.027 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:23.027 INFO:tasks.workunit.client.0.vm01.stderr:+ grep testimg3
2026-03-24T17:20:23.051 INFO:tasks.workunit.client.0.vm01.stdout:25bf89b9bbb0 testimg3
2026-03-24T17:20:23.051 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv testimg5
2026-03-24T17:20:23.298 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:23.298 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:20:23.298 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 4
2026-03-24T17:20:23.320 INFO:tasks.workunit.client.0.vm01.stdout:4
2026-03-24T17:20:23.320 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge
2026-03-24T17:20:23.320 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'some expired images could not be removed'
2026-03-24T17:20:24.941 INFO:tasks.workunit.client.0.vm01.stdout:rbd: some expired images could not be removed
2026-03-24T17:20:24.941 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:24.941 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:20:24.941 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 1
2026-03-24T17:20:24.964 INFO:tasks.workunit.client.0.vm01.stdout:1
2026-03-24T17:20:24.964 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:24.964 INFO:tasks.workunit.client.0.vm01.stderr:+ grep testimg3
2026-03-24T17:20:24.987 INFO:tasks.workunit.client.0.vm01.stdout:25bf89b9bbb0 testimg3
2026-03-24T17:20:24.987 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd trash ls
2026-03-24T17:20:24.987 INFO:tasks.workunit.client.0.vm01.stderr:++ awk '{ print $1 }'
2026-03-24T17:20:25.010 INFO:tasks.workunit.client.0.vm01.stderr:+ ID=25bf89b9bbb0
2026-03-24T17:20:25.010 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap purge --image-id 25bf89b9bbb0
2026-03-24T17:20:25.900 INFO:tasks.workunit.client.0.vm01.stderr: Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done.
2026-03-24T17:20:25.909 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge
2026-03-24T17:20:25.952 INFO:tasks.workunit.client.0.vm01.stderr: Removing images: 100% complete... Removing images: 100% complete...done.
2026-03-24T17:20:25.955 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls
2026-03-24T17:20:25.955 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:20:25.955 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 0
2026-03-24T17:20:25.977 INFO:tasks.workunit.client.0.vm01.stdout:0
2026-03-24T17:20:25.977 INFO:tasks.workunit.client.0.vm01.stderr:+ test_deep_copy_clone
2026-03-24T17:20:25.977 INFO:tasks.workunit.client.0.vm01.stdout:testing deep copy clone...
2026-03-24T17:20:25.977 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'testing deep copy clone...'
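[editor's note] The last image to leave the trash above is testimg3, which purge skips while it still has a snapshot; the workunit clears it by purging snapshots through the trash image ID. The same unblocking step, sketched with illustrative values (assumes a live cluster and a single trashed image):

```shell
# "rbd trash purge" skips a trashed image that still has snapshots.
# The workunit resolves this by addressing the image via its trash ID:
ID=$(rbd trash ls | awk '{ print $1 }')   # assumes exactly one trashed image
rbd snap purge --image-id "$ID"           # remove all of its snapshots
rbd trash purge                           # now completes: Removing images ... done.
```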
2026-03-24T17:20:25.977 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images
2026-03-24T17:20:25.977 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:26.240 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:26.301 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:26.362 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:26.421 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:26.481 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:26.648 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:26.709 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:26.769 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:26.830 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:26.895 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:26.954 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:27.016 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:27.077 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:27.138 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:27.203 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:27.264 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:27.324 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:27.385 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create testimg1 --image-format 2 --size 256
2026-03-24T17:20:27.418 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create testimg1 --snap=snap1
2026-03-24T17:20:27.907 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T17:20:27.914 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap protect testimg1@snap1
2026-03-24T17:20:27.943 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone testimg1@snap1 testimg2
2026-03-24T17:20:27.988 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create testimg2@snap2
2026-03-24T17:20:28.913 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T17:20:28.920 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd deep copy testimg2 testimg3
2026-03-24T17:20:28.973 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:20:28.966+0000 7fb827fff640 0 -- 192.168.123.101:0/3033575979 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7fb804057ef0 msgr2=0x7fb80405b270 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T17:20:29.921 INFO:tasks.workunit.client.0.vm01.stderr: Image deep copy: 1% complete... Image deep copy: 3% complete... Image deep copy: 4% complete... Image deep copy: 6% complete... Image deep copy: 7% complete... Image deep copy: 9% complete... Image deep copy: 10% complete... Image deep copy: 12% complete... Image deep copy: 14% complete... Image deep copy: 15% complete... Image deep copy: 17% complete... Image deep copy: 18% complete... Image deep copy: 20% complete... Image deep copy: 21% complete... Image deep copy: 23% complete... Image deep copy: 25% complete... Image deep copy: 26% complete... Image deep copy: 28% complete... Image deep copy: 29% complete... Image deep copy: 31% complete... Image deep copy: 32% complete... Image deep copy: 34% complete... Image deep copy: 35% complete... Image deep copy: 37% complete... Image deep copy: 39% complete... Image deep copy: 40% complete... Image deep copy: 42% complete... Image deep copy: 43% complete... Image deep copy: 45% complete... Image deep copy: 46% complete... Image deep copy: 48% complete... Image deep copy: 50% complete... Image deep copy: 51% complete... Image deep copy: 53% complete... Image deep copy: 54% complete... Image deep copy: 56% complete... Image deep copy: 57% complete... Image deep copy: 59% complete... Image deep copy: 60% complete... Image deep copy: 62% complete... Image deep copy: 64% complete... Image deep copy: 65% complete... Image deep copy: 67% complete... Image deep copy: 68% complete... Image deep copy: 70% complete... Image deep copy: 71% complete... Image deep copy: 73% complete... Image deep copy: 75% complete... Image deep copy: 76% complete... Image deep copy: 78% complete... Image deep copy: 79% complete... Image deep copy: 81% complete... Image deep copy: 82% complete... Image deep copy: 84% complete... Image deep copy: 85% complete... Image deep copy: 87% complete... Image deep copy: 89% complete... Image deep copy: 90% complete... Image deep copy: 92% complete... Image deep copy: 93% complete... Image deep copy: 95% complete... Image deep copy: 96% complete... Image deep copy: 98% complete... Image deep copy: 100% complete...2026-03-24T17:20:29.914+0000 7fb82cfd7640 0 -- 192.168.123.101:0/3033575979 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x7fb808004a30 msgr2=0x7fb808024e10 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:20:29.923 INFO:tasks.workunit.client.0.vm01.stderr: Image deep copy: 100% complete...done.
2026-03-24T17:20:29.927 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info testimg3
2026-03-24T17:20:29.927 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'size 256 MiB'
2026-03-24T17:20:29.958 INFO:tasks.workunit.client.0.vm01.stdout: size 256 MiB in 64 objects
2026-03-24T17:20:29.958 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info testimg3
2026-03-24T17:20:29.958 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'parent: rbd/testimg1@snap1'
2026-03-24T17:20:29.988 INFO:tasks.workunit.client.0.vm01.stdout: parent: rbd/testimg1@snap1
2026-03-24T17:20:29.988 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap ls testimg3
2026-03-24T17:20:29.988 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -v SNAPID
2026-03-24T17:20:29.988 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:20:29.988 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 1
2026-03-24T17:20:30.020 INFO:tasks.workunit.client.0.vm01.stdout:1
2026-03-24T17:20:30.020 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap ls testimg3
2026-03-24T17:20:30.020 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '.*snap2.*'
2026-03-24T17:20:30.052 INFO:tasks.workunit.client.0.vm01.stdout: 40 snap2 256 MiB Tue Mar 24 17:20:29 2026
2026-03-24T17:20:30.052 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info testimg2
2026-03-24T17:20:30.052 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'features:.*deep-flatten'
2026-03-24T17:20:30.085 INFO:tasks.workunit.client.0.vm01.stdout: features: layering, exclusive-lock, object-map, fast-diff, deep-flatten
2026-03-24T17:20:30.085 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info testimg3
2026-03-24T17:20:30.085 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'features:.*deep-flatten'
2026-03-24T17:20:30.118 INFO:tasks.workunit.client.0.vm01.stdout: features: layering, exclusive-lock, object-map, fast-diff, deep-flatten
2026-03-24T17:20:30.119 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd flatten testimg2
2026-03-24T17:20:30.157 INFO:tasks.workunit.client.0.vm01.stderr: Image flatten: 1% complete... Image flatten: 100% complete...done.
2026-03-24T17:20:30.166 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd flatten testimg3
2026-03-24T17:20:30.204 INFO:tasks.workunit.client.0.vm01.stderr: Image flatten: 1% complete... Image flatten: 100% complete...done.
2026-03-24T17:20:30.212 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap unprotect testimg1@snap1
2026-03-24T17:20:30.253 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap purge testimg2
2026-03-24T17:20:30.918 INFO:tasks.workunit.client.0.vm01.stderr: Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done.
2026-03-24T17:20:30.928 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap purge testimg3
2026-03-24T17:20:31.856 INFO:tasks.workunit.client.0.vm01.stderr: Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done.
2026-03-24T17:20:31.866 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm testimg2
2026-03-24T17:20:31.931 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 1% complete... Removing image: 98% complete...2026-03-24T17:20:31.926+0000 7f5ac67f5640 0 -- 192.168.123.101:0/2079460636 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x562c30a09380 msgr2=0x562c309f9050 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:20:31.938 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:20:31.941 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm testimg3
2026-03-24T17:20:32.006 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 1% complete... Removing image: 98% complete...2026-03-24T17:20:32.002+0000 7fb64ce7c640 0 -- 192.168.123.101:0/2675420117 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55ef8ab5b320 msgr2=0x55ef8ab908a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:20:32.011 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:20:32.015 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap protect testimg1@snap1
2026-03-24T17:20:32.044 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone testimg1@snap1 testimg2
2026-03-24T17:20:32.088 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create testimg2@snap2
2026-03-24T17:20:32.928 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T17:20:32.935 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd deep copy --flatten testimg2 testimg3
2026-03-24T17:20:34.007 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:20:34.002+0000 7f661f9af640 0 -- 192.168.123.101:0/2996986988 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x7f65fc0047d0 msgr2=0x7f65fc025140 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T17:20:34.145 INFO:tasks.workunit.client.0.vm01.stderr: Image deep copy: 1% complete... Image deep copy: 7% complete...2026-03-24T17:20:34.138+0000 7f6620c38640 0 -- 192.168.123.101:0/2996986988 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55e491785170 msgr2=0x55e4918c6260 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:20:35.115 INFO:tasks.workunit.client.0.vm01.stderr: Image deep copy: 9% complete... Image deep copy: 100% complete... Image deep copy: 100% complete...done.
2026-03-24T17:20:35.121 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info testimg3
2026-03-24T17:20:35.121 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'size 256 MiB'
2026-03-24T17:20:35.152 INFO:tasks.workunit.client.0.vm01.stdout: size 256 MiB in 64 objects
2026-03-24T17:20:35.152 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info testimg3
2026-03-24T17:20:35.152 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -v parent:
2026-03-24T17:20:35.180 INFO:tasks.workunit.client.0.vm01.stdout:rbd image 'testimg3':
2026-03-24T17:20:35.180 INFO:tasks.workunit.client.0.vm01.stdout: size 256 MiB in 64 objects
2026-03-24T17:20:35.180 INFO:tasks.workunit.client.0.vm01.stdout: order 22 (4 MiB objects)
2026-03-24T17:20:35.180 INFO:tasks.workunit.client.0.vm01.stdout: snapshot_count: 1
2026-03-24T17:20:35.180 INFO:tasks.workunit.client.0.vm01.stdout: id: 26ca11530a76
2026-03-24T17:20:35.180 INFO:tasks.workunit.client.0.vm01.stdout: block_name_prefix: rbd_data.26ca11530a76
2026-03-24T17:20:35.180 INFO:tasks.workunit.client.0.vm01.stdout: format: 2
2026-03-24T17:20:35.180 INFO:tasks.workunit.client.0.vm01.stdout: features: layering, exclusive-lock, object-map, fast-diff, deep-flatten
2026-03-24T17:20:35.180 INFO:tasks.workunit.client.0.vm01.stdout: op_features:
2026-03-24T17:20:35.180 INFO:tasks.workunit.client.0.vm01.stdout: flags:
2026-03-24T17:20:35.180 INFO:tasks.workunit.client.0.vm01.stdout: create_timestamp: Tue Mar 24 17:20:32 2026
2026-03-24T17:20:35.180 INFO:tasks.workunit.client.0.vm01.stdout: access_timestamp: Tue Mar 24 17:20:32 2026
2026-03-24T17:20:35.180 INFO:tasks.workunit.client.0.vm01.stdout: modify_timestamp: Tue Mar 24 17:20:32 2026
2026-03-24T17:20:35.180 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap ls testimg3
2026-03-24T17:20:35.180 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -v SNAPID
2026-03-24T17:20:35.180 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:20:35.180 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 1
2026-03-24T17:20:35.206 INFO:tasks.workunit.client.0.vm01.stdout:1
2026-03-24T17:20:35.206 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap ls testimg3
2026-03-24T17:20:35.206 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '.*snap2.*'
2026-03-24T17:20:35.233 INFO:tasks.workunit.client.0.vm01.stdout: 42 snap2 256 MiB Tue Mar 24 17:20:33 2026
2026-03-24T17:20:35.233 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info testimg2
2026-03-24T17:20:35.233 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'features:.*deep-flatten'
2026-03-24T17:20:35.263 INFO:tasks.workunit.client.0.vm01.stdout: features: layering, exclusive-lock, object-map, fast-diff, deep-flatten
2026-03-24T17:20:35.263 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd flatten testimg2
2026-03-24T17:20:35.297 INFO:tasks.workunit.client.0.vm01.stderr: Image flatten: 1% complete... Image flatten: 100% complete...done.
2026-03-24T17:20:35.305 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap unprotect testimg1@snap1
2026-03-24T17:20:35.337 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images
2026-03-24T17:20:35.337 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:36.011 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:36.978 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:38.169 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:38.233 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:38.294 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:38.360 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:38.422 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:38.483 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:38.544 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:38.607 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:38.670 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:38.734 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:38.804 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:38.868 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:38.931 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:38.999 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:39.066 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:39.136 INFO:tasks.workunit.client.0.vm01.stderr:+ test_clone_v2
2026-03-24T17:20:39.136 INFO:tasks.workunit.client.0.vm01.stdout:testing clone v2...
2026-03-24T17:20:39.136 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'testing clone v2...'
2026-03-24T17:20:39.136 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images
2026-03-24T17:20:39.136 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:39.200 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:39.262 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:39.325 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:39.389 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:39.453 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:39.519 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:39.580 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:39.642 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:39.702 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:39.763 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:39.825 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:39.889 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:39.955 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:40.018 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:40.091 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:40.152 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:40.215 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:20:40.277 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 -s 1 test1
2026-03-24T17:20:40.311 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create test1@1
2026-03-24T17:20:40.948 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T17:20:40.955 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone --rbd-default-clone-format=1 test1@1 test2
2026-03-24T17:20:40.981 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:20:40.974+0000 7f7c62ffd640 -1 librbd::image::CloneRequest: 0x55b41f0e2ce0 validate_parent: parent snapshot must be protected
2026-03-24T17:20:40.981 INFO:tasks.workunit.client.0.vm01.stderr:rbd: clone error: (22) Invalid argument
2026-03-24T17:20:40.984 INFO:tasks.workunit.client.0.vm01.stderr:+ true
2026-03-24T17:20:40.984 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone --rbd-default-clone-format=2 test1@1 test2
2026-03-24T17:20:41.030 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd snap ls test1 --format json
2026-03-24T17:20:41.030 INFO:tasks.workunit.client.0.vm01.stderr:++ jq '.[] | select(.name == "1") | .id'
2026-03-24T17:20:41.056 INFO:tasks.workunit.client.0.vm01.stderr:+ SNAP_ID=43
2026-03-24T17:20:41.056 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone --rbd-default-clone-format=2 --snap-id 43 test1 test3
2026-03-24T17:20:41.103 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap protect test1@1
2026-03-24T17:20:41.134 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone --rbd-default-clone-format=1 test1@1 test4
2026-03-24T17:20:41.194 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd children test1@1
2026-03-24T17:20:41.194 INFO:tasks.workunit.client.0.vm01.stderr:+ sort
2026-03-24T17:20:41.194 INFO:tasks.workunit.client.0.vm01.stderr:+ tr '\n' ' '
2026-03-24T17:20:41.194 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -E 'test2.*test3.*test4'
2026-03-24T17:20:41.228 INFO:tasks.workunit.client.0.vm01.stdout:rbd/test2 rbd/test3 rbd/test4
2026-03-24T17:20:41.228 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd children --descendants test1
2026-03-24T17:20:41.228 INFO:tasks.workunit.client.0.vm01.stderr:+ sort
2026-03-24T17:20:41.228 INFO:tasks.workunit.client.0.vm01.stderr:+ tr '\n' ' '
2026-03-24T17:20:41.228 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -E 'test2.*test3.*test4'
2026-03-24T17:20:41.266 INFO:tasks.workunit.client.0.vm01.stdout:rbd/test2 rbd/test3 rbd/test4
2026-03-24T17:20:41.266 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd remove test4
2026-03-24T17:20:41.329 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:20:41.322+0000 7f1a2996c640 0 -- 192.168.123.101:0/2428517766 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f1a0c008d30 msgr2=0x7f1a0c0291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:20:41.334 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:20:41.337 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap unprotect test1@1
2026-03-24T17:20:41.371 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap remove test1@1
2026-03-24T17:20:41.402 INFO:tasks.workunit.client.0.vm01.stderr: Removing snap: 100% complete...done.
2026-03-24T17:20:41.409 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap list --all test1
2026-03-24T17:20:41.409 INFO:tasks.workunit.client.0.vm01.stderr:+ grep -E 'trash \(user 1\) *$'
2026-03-24T17:20:41.435 INFO:tasks.workunit.client.0.vm01.stdout: 43 9939cfdc-d73b-4929-b7ca-5bc5d831c904 1 MiB Tue Mar 24 17:20:40 2026 trash (user 1)
2026-03-24T17:20:41.436 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create test1@2
2026-03-24T17:20:42.071 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T17:20:42.080 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test1
2026-03-24T17:20:42.080 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'image has snapshots'
2026-03-24T17:20:42.124 INFO:tasks.workunit.client.0.vm01.stdout:rbd: image has snapshots - these must be deleted with 'rbd snap purge' before the image can be removed.
2026-03-24T17:20:42.124 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap rm test1@2
2026-03-24T17:20:43.076 INFO:tasks.workunit.client.0.vm01.stderr: Removing snap: 100% complete...done.
2026-03-24T17:20:43.087 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test1
2026-03-24T17:20:43.087 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'linked clones'
2026-03-24T17:20:43.132 INFO:tasks.workunit.client.0.vm01.stdout:rbd: image has snapshots with linked clones - these must be deleted or flattened before the image can be removed.
2026-03-24T17:20:43.132 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test3
2026-03-24T17:20:43.197 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:20:43.190+0000 7fb0901ec640 0 -- 192.168.123.101:0/125356098 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x5647557fd320 msgr2=0x564755832390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:20:43.203 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:20:43.207 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test1
2026-03-24T17:20:43.207 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'linked clones'
2026-03-24T17:20:43.249 INFO:tasks.workunit.client.0.vm01.stdout:rbd: image has snapshots with linked clones - these must be deleted or flattened before the image can be removed.
2026-03-24T17:20:43.249 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd flatten test2
2026-03-24T17:20:44.087 INFO:tasks.workunit.client.0.vm01.stderr: Image flatten: 100% complete...done.
2026-03-24T17:20:44.096 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap list --all test1
2026-03-24T17:20:44.096 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l
2026-03-24T17:20:44.096 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^0$'
2026-03-24T17:20:44.123 INFO:tasks.workunit.client.0.vm01.stdout:0
2026-03-24T17:20:44.123 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test1
2026-03-24T17:20:44.189 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:20:44.182+0000 7f4622239640 0 -- 192.168.123.101:0/1495960550 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x55be19885f60 msgr2=0x55be198a63e0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:20:44.193 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:20:44.198 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test2
2026-03-24T17:20:44.260 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:20:44.254+0000 7fd08fb02640 0 -- 192.168.123.101:0/1577452767 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7fd07005bfc0 msgr2=0x7fd07007c3c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:20:44.268 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:20:44.272 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 -s 1 test1
2026-03-24T17:20:44.308 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create test1@1
2026-03-24T17:20:45.085 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T17:20:45.092 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create test1@2
2026-03-24T17:20:46.091 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T17:20:46.098 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone test1@1 test2 --rbd-default-clone-format 2
2026-03-24T17:20:46.144 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone test1@2 test3 --rbd-default-clone-format 2
2026-03-24T17:20:46.186 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap rm test1@1
2026-03-24T17:20:46.214 INFO:tasks.workunit.client.0.vm01.stderr: Removing snap: 100% complete...done.
2026-03-24T17:20:46.220 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap rm test1@2
2026-03-24T17:20:46.248 INFO:tasks.workunit.client.0.vm01.stderr: Removing snap: 100% complete...done.
2026-03-24T17:20:46.254 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd rm test1
2026-03-24T17:20:46.254 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test1
2026-03-24T17:20:46.286 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:20:46.282+0000 7f38ee757200 -1 librbd::api::Image: remove: image has snapshots - not removing
2026-03-24T17:20:46.286 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 0% complete...failed.
2026-03-24T17:20:46.289 INFO:tasks.workunit.client.0.vm01.stderr:rbd: image has snapshots with linked clones - these must be deleted or flattened before the image can be removed.
2026-03-24T17:20:46.292 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-24T17:20:46.293 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test1 --rbd-move-parent-to-trash-on-remove=true 2026-03-24T17:20:46.339 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T17:20:46.343 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls -a 2026-03-24T17:20:46.343 INFO:tasks.workunit.client.0.vm01.stderr:+ grep test1 2026-03-24T17:20:46.364 INFO:tasks.workunit.client.0.vm01.stdout:27fde964e2de test1 2026-03-24T17:20:46.365 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test2 2026-03-24T17:20:46.418 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:20:46.414+0000 7fe4bffff640 0 -- 192.168.123.101:0/2705066430 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7fe49c008d30 msgr2=0x7fe4a407cd50 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:20:46.424 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:20:46.418+0000 7fe4c4af2640 0 -- 192.168.123.101:0/2705066430 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x55e01874a240 msgr2=0x55e01876a6c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:20:47.082 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 
2026-03-24T17:20:47.087 INFO:tasks.workunit.client.0.vm01.stderr:+ grep test1 2026-03-24T17:20:47.087 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls -a 2026-03-24T17:20:47.108 INFO:tasks.workunit.client.0.vm01.stdout:27fde964e2de test1 2026-03-24T17:20:47.109 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test3 2026-03-24T17:20:47.166 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:20:47.158+0000 7f14a559d640 0 -- 192.168.123.101:0/1523418123 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55bd2c1d1320 msgr2=0x55bd2c206390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:20:47.175 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:20:47.170+0000 7f14a559d640 0 -- 192.168.123.101:0/1523418123 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7f148405bea0 msgr2=0x7f148407c2a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:20:48.120 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T17:20:48.124 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls -a 2026-03-24T17:20:48.124 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail grep test1 2026-03-24T17:20:48.124 INFO:tasks.workunit.client.0.vm01.stderr:+ grep test1 2026-03-24T17:20:48.148 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-24T17:20:48.148 INFO:tasks.workunit.client.0.vm01.stdout:testing thick provision... 2026-03-24T17:20:48.148 INFO:tasks.workunit.client.0.vm01.stderr:+ test_thick_provision 2026-03-24T17:20:48.148 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'testing thick provision...' 
2026-03-24T17:20:48.148 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images 2026-03-24T17:20:48.148 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:20:48.215 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:20:48.280 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:20:48.343 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:20:48.406 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:20:48.469 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:20:48.533 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:20:48.596 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:20:48.659 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:20:48.721 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:20:48.784 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:20:48.847 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:20:48.910 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:20:48.974 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:20:49.037 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:20:49.107 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:20:49.175 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:20:49.240 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:20:49.301 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --thick-provision -s 64M test1 2026-03-24T17:20:49.705 INFO:tasks.workunit.client.0.vm01.stderr: Thick provisioning: 6% complete... Thick provisioning: 12% complete... Thick provisioning: 18% complete... Thick provisioning: 25% complete... Thick provisioning: 31% complete... Thick provisioning: 37% complete... 
Thick provisioning: 43% complete... Thick provisioning: 50% complete... Thick provisioning: 56% complete... Thick provisioning: 62% complete... Thick provisioning: 68% complete... Thick provisioning: 75% complete... Thick provisioning: 81% complete... Thick provisioning: 87% complete... Thick provisioning: 93% complete... Thick provisioning: 100% complete... Thick provisioning: 100% complete...done. 2026-03-24T17:20:49.712 INFO:tasks.workunit.client.0.vm01.stderr:+ count=0 2026-03-24T17:20:49.712 INFO:tasks.workunit.client.0.vm01.stderr:+ ret= 2026-03-24T17:20:49.712 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' 0 -lt 10 ']' 2026-03-24T17:20:49.712 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd du 2026-03-24T17:20:49.712 INFO:tasks.workunit.client.0.vm01.stderr:+ grep test1 2026-03-24T17:20:49.712 INFO:tasks.workunit.client.0.vm01.stderr:+ tr -s ' ' 2026-03-24T17:20:49.712 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^64 MiB' 2026-03-24T17:20:49.713 INFO:tasks.workunit.client.0.vm01.stderr:+ cut -d ' ' -f 4-5 2026-03-24T17:20:49.739 INFO:tasks.workunit.client.0.vm01.stdout:64 MiB 2026-03-24T17:20:49.739 INFO:tasks.workunit.client.0.vm01.stderr:+ ret=0 2026-03-24T17:20:49.739 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' 0 = 0 ']' 2026-03-24T17:20:49.739 INFO:tasks.workunit.client.0.vm01.stderr:+ break 2026-03-24T17:20:49.739 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd du 2026-03-24T17:20:49.764 INFO:tasks.workunit.client.0.vm01.stdout:NAME PROVISIONED USED 2026-03-24T17:20:49.764 INFO:tasks.workunit.client.0.vm01.stdout:test1 64 MiB 64 MiB 2026-03-24T17:20:49.767 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' 0 '!=' 0 ']' 2026-03-24T17:20:49.767 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test1 2026-03-24T17:20:49.866 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 6% complete... Removing image: 12% complete... Removing image: 18% complete... Removing image: 25% complete... Removing image: 31% complete... Removing image: 37% complete... 
Removing image: 43% complete... Removing image: 50% complete... Removing image: 56% complete... Removing image: 62% complete... Removing image: 68% complete... Removing image: 75% complete... Removing image: 81% complete... Removing image: 87% complete... Removing image: 93% complete...2026-03-24T17:20:49.862+0000 7fb0775a1640 0 -- 192.168.123.101:0/3387930459 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55d129760320 msgr2=0x55d12973e800 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:20:49.876 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:20:49.870+0000 7fb0775a1640 0 -- 192.168.123.101:0/3387930459 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7fb05405bfd0 msgr2=0x7fb05407c3d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:20:49.880 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T17:20:49.884 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls 2026-03-24T17:20:49.884 INFO:tasks.workunit.client.0.vm01.stderr:+ grep test1 2026-03-24T17:20:49.884 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:20:49.884 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^0$' 2026-03-24T17:20:49.909 INFO:tasks.workunit.client.0.vm01.stdout:0 2026-03-24T17:20:49.909 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --thick-provision -s 4G test1 2026-03-24T17:20:50.666 INFO:tasks.workunit.client.0.vm01.stderr: Thick provisioning: 1% complete... Thick provisioning: 2% complete... Thick provisioning: 3% complete... 
Thick provisioning: 4% complete...2026-03-24T17:20:50.662+0000 7f8a07650640 0 -- 192.168.123.101:0/405847399 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7f89e405c020 msgr2=0x7f89e407c420 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:20:50.887 INFO:tasks.workunit.client.0.vm01.stderr: Thick provisioning: 5% complete...2026-03-24T17:20:50.882+0000 7f8a07650640 0 -- 192.168.123.101:0/405847399 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55ea375eea70 msgr2=0x7f89e407ca50 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:21:14.164 INFO:tasks.workunit.client.0.vm01.stderr: Thick provisioning: 6% complete... Thick provisioning: 7% complete... Thick provisioning: 8% complete... Thick provisioning: 9% complete... Thick provisioning: 10% complete... Thick provisioning: 11% complete... Thick provisioning: 12% complete... Thick provisioning: 13% complete... Thick provisioning: 14% complete... Thick provisioning: 15% complete... Thick provisioning: 16% complete... Thick provisioning: 17% complete... Thick provisioning: 18% complete... Thick provisioning: 19% complete... Thick provisioning: 20% complete... Thick provisioning: 21% complete... Thick provisioning: 22% complete... Thick provisioning: 23% complete... Thick provisioning: 24% complete... Thick provisioning: 25% complete... Thick provisioning: 26% complete... Thick provisioning: 27% complete... Thick provisioning: 28% complete... Thick provisioning: 29% complete... Thick provisioning: 30% complete... Thick provisioning: 31% complete... Thick provisioning: 32% complete... Thick provisioning: 33% complete... Thick provisioning: 34% complete... Thick provisioning: 35% complete... Thick provisioning: 36% complete... Thick provisioning: 37% complete... Thick provisioning: 38% complete... Thick provisioning: 39% complete... Thick provisioning: 40% complete... 
Thick provisioning: 41% complete... Thick provisioning: 42% complete... Thick provisioning: 43% complete... Thick provisioning: 44% complete... Thick provisioning: 45% complete... Thick provisioning: 46% complete... Thick provisioning: 47% complete... Thick provisioning: 48% complete... Thick provisioning: 49% complete... Thick provisioning: 50% complete... Thick provisioning: 51% complete... Thick provisioning: 52% complete... Thick provisioning: 53% complete... Thick provisioning: 54% complete... Thick provisioning: 55% complete... Thick provisioning: 56% complete... Thick provisioning: 57% complete... Thick provisioning: 58% complete... Thick provisioning: 59% complete... Thick provisioning: 60% complete... Thick provisioning: 61% complete... Thick provisioning: 62% complete... Thick provisioning: 63% complete... Thick provisioning: 64% complete... Thick provisioning: 65% complete... Thick provisioning: 66% complete... Thick provisioning: 67% complete... Thick provisioning: 68% complete... Thick provisioning: 69% complete... Thick provisioning: 70% complete... Thick provisioning: 71% complete... Thick provisioning: 72% complete... Thick provisioning: 73% complete... Thick provisioning: 74% complete... Thick provisioning: 75% complete... Thick provisioning: 76% complete... Thick provisioning: 77% complete... Thick provisioning: 78% complete... Thick provisioning: 79% complete... Thick provisioning: 80% complete... Thick provisioning: 81% complete... Thick provisioning: 82% complete... Thick provisioning: 83% complete... Thick provisioning: 84% complete... Thick provisioning: 85% complete... Thick provisioning: 86% complete... Thick provisioning: 87% complete... Thick provisioning: 88% complete... Thick provisioning: 89% complete... Thick provisioning: 90% complete... Thick provisioning: 91% complete... Thick provisioning: 92% complete... Thick provisioning: 93% complete... Thick provisioning: 94% complete... Thick provisioning: 95% complete... 
Thick provisioning: 96% complete... Thick provisioning: 97% complete... Thick provisioning: 98% complete... Thick provisioning: 99% complete... Thick provisioning: 100% complete... Thick provisioning: 100% complete...done. 2026-03-24T17:21:14.175 INFO:tasks.workunit.client.0.vm01.stderr:+ count=0 2026-03-24T17:21:14.175 INFO:tasks.workunit.client.0.vm01.stderr:+ ret= 2026-03-24T17:21:14.175 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' 0 -lt 10 ']' 2026-03-24T17:21:14.175 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd du 2026-03-24T17:21:14.175 INFO:tasks.workunit.client.0.vm01.stderr:+ grep test1 2026-03-24T17:21:14.175 INFO:tasks.workunit.client.0.vm01.stderr:+ tr -s ' ' 2026-03-24T17:21:14.175 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^4 GiB' 2026-03-24T17:21:14.175 INFO:tasks.workunit.client.0.vm01.stderr:+ cut -d ' ' -f 4-5 2026-03-24T17:21:14.202 INFO:tasks.workunit.client.0.vm01.stdout:4 GiB 2026-03-24T17:21:14.202 INFO:tasks.workunit.client.0.vm01.stderr:+ ret=0 2026-03-24T17:21:14.202 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' 0 = 0 ']' 2026-03-24T17:21:14.202 INFO:tasks.workunit.client.0.vm01.stderr:+ break 2026-03-24T17:21:14.202 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd du 2026-03-24T17:21:14.226 INFO:tasks.workunit.client.0.vm01.stdout:NAME PROVISIONED USED 2026-03-24T17:21:14.226 INFO:tasks.workunit.client.0.vm01.stdout:test1 4 GiB 4 GiB 2026-03-24T17:21:14.229 INFO:tasks.workunit.client.0.vm01.stderr:+ '[' 0 '!=' 0 ']' 2026-03-24T17:21:14.229 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm test1 2026-03-24T17:21:14.325 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 1% complete...2026-03-24T17:21:14.318+0000 7fb3812d1640 0 -- 192.168.123.101:0/1080432654 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55eaee13a320 msgr2=0x55eaee118800 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:21:14.365 INFO:tasks.workunit.client.0.vm01.stderr: Removing 
image: 2% complete... Removing image: 3% complete...2026-03-24T17:21:14.358+0000 7fb3812d1640 0 -- 192.168.123.101:0/1080432654 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7fb36005bf20 msgr2=0x7fb36007c320 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:21:16.511 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 4% complete... Removing image: 5% complete... Removing image: 6% complete... Removing image: 7% complete... Removing image: 8% complete... Removing image: 9% complete... Removing image: 10% complete... Removing image: 11% complete... Removing image: 12% complete... Removing image: 13% complete... Removing image: 14% complete... Removing image: 15% complete... Removing image: 16% complete... Removing image: 17% complete... Removing image: 18% complete... Removing image: 19% complete... Removing image: 20% complete... Removing image: 21% complete... Removing image: 22% complete... Removing image: 23% complete... Removing image: 24% complete... Removing image: 25% complete... Removing image: 26% complete... Removing image: 27% complete... Removing image: 28% complete... Removing image: 29% complete... Removing image: 30% complete... Removing image: 31% complete... Removing image: 32% complete... Removing image: 33% complete... Removing image: 34% complete... Removing image: 35% complete... Removing image: 36% complete... Removing image: 37% complete... Removing image: 38% complete... Removing image: 39% complete... Removing image: 40% complete... Removing image: 41% complete... Removing image: 42% complete... Removing image: 43% complete... Removing image: 44% complete... Removing image: 45% complete... Removing image: 46% complete... Removing image: 47% complete... Removing image: 48% complete... Removing image: 49% complete... Removing image: 50% complete... Removing image: 51% complete... Removing image: 52% complete... Removing image: 53% complete... 
Removing image: 54% complete... Removing image: 55% complete... Removing image: 56% complete... Removing image: 57% complete... Removing image: 58% complete... Removing image: 59% complete... Removing image: 60% complete... Removing image: 61% complete... Removing image: 62% complete... Removing image: 63% complete... Removing image: 64% complete... Removing image: 65% complete... Removing image: 66% complete... Removing image: 67% complete... Removing image: 68% complete... Removing image: 69% complete... Removing image: 70% complete... Removing image: 71% complete... Removing image: 72% complete... Removing image: 73% complete... Removing image: 74% complete... Removing image: 75% complete... Removing image: 76% complete... Removing image: 77% complete... Removing image: 78% complete... Removing image: 79% complete... Removing image: 80% complete... Removing image: 81% complete... Removing image: 82% complete... Removing image: 83% complete... Removing image: 84% complete... Removing image: 85% complete... Removing image: 86% complete... Removing image: 87% complete... Removing image: 88% complete... Removing image: 89% complete... Removing image: 90% complete... Removing image: 91% complete... Removing image: 92% complete... Removing image: 93% complete... Removing image: 94% complete... Removing image: 95% complete... Removing image: 96% complete... Removing image: 97% complete... Removing image: 98% complete... Removing image: 99% complete... Removing image: 100% complete...done. 2026-03-24T17:21:16.515 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd ls 2026-03-24T17:21:16.515 INFO:tasks.workunit.client.0.vm01.stderr:+ grep test1 2026-03-24T17:21:16.515 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:21:16.515 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^0$' 2026-03-24T17:21:16.537 INFO:tasks.workunit.client.0.vm01.stdout:0 2026-03-24T17:21:16.537 INFO:tasks.workunit.client.0.vm01.stdout:testing namespace... 
2026-03-24T17:21:16.537 INFO:tasks.workunit.client.0.vm01.stderr:+ test_namespace 2026-03-24T17:21:16.538 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'testing namespace...' 2026-03-24T17:21:16.538 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images 2026-03-24T17:21:16.538 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:16.611 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:16.682 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:16.755 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:16.821 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:16.888 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:16.955 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:17.223 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:17.290 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:17.357 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:17.424 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:17.491 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:17.558 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:17.626 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:17.698 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:17.765 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:17.830 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:17.896 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:17.962 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd namespace ls 2026-03-24T17:21:17.963 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:21:17.963 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 
'^0$' 2026-03-24T17:21:17.984 INFO:tasks.workunit.client.0.vm01.stdout:0 2026-03-24T17:21:17.984 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd namespace create rbd/test1 2026-03-24T17:21:18.013 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd namespace create --pool rbd --namespace test2 2026-03-24T17:21:18.043 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd namespace create --namespace test3 2026-03-24T17:21:18.077 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd namespace create rbd/test3 2026-03-24T17:21:18.077 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd namespace create rbd/test3 2026-03-24T17:21:18.101 INFO:tasks.workunit.client.0.vm01.stderr:rbd: failed to created namespace: 2026-03-24T17:21:18.094+0000 7fe23214a200 -1 librbd::api::Namespace: create: failed to add namespace: (17) File exists 2026-03-24T17:21:18.101 INFO:tasks.workunit.client.0.vm01.stderr:(17) File exists 2026-03-24T17:21:18.104 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-24T17:21:18.105 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd namespace list 2026-03-24T17:21:18.105 INFO:tasks.workunit.client.0.vm01.stderr:+ grep test 2026-03-24T17:21:18.105 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:21:18.105 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^3$' 2026-03-24T17:21:18.127 INFO:tasks.workunit.client.0.vm01.stdout:3 2026-03-24T17:21:18.127 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd namespace remove --pool rbd missing 2026-03-24T17:21:18.127 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd namespace remove --pool rbd missing 2026-03-24T17:21:18.141 INFO:tasks.workunit.client.0.vm01.stderr:rbd: namespace name was not specified 2026-03-24T17:21:18.143 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-24T17:21:18.143 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 1G rbd/test1/image1 2026-03-24T17:21:18.176 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd bench --io-type write --io-pattern rand 
--io-total 32M --io-size 4K rbd/test1/image1 2026-03-24T17:21:18.209 INFO:tasks.workunit.client.0.vm01.stdout:bench type write io_size 4096 io_threads 16 bytes 33554432 pattern random 2026-03-24T17:21:18.293 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:18.286+0000 7fd44f888640 0 -- 192.168.123.101:0/1629552713 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x5622c30895e0 msgr2=0x5622c3079f90 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-24T17:21:18.757 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:18.750+0000 7fd44f888640 0 -- 192.168.123.101:0/1629552713 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7fd430012c10 msgr2=0x7fd42c07c960 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:21:19.040 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:19.034+0000 7fd44f888640 0 -- 192.168.123.101:0/1629552713 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7fd430012c10 msgr2=0x7fd42c07c960 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:21:19.206 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:19.202+0000 7fd44f888640 0 -- 192.168.123.101:0/1629552713 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7fd42c05bef0 msgr2=0x7fd42c07c2f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:21:19.238 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:19.230+0000 7fd44f888640 0 -- 192.168.123.101:0/1629552713 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7fd42c05bef0 msgr2=0x7fd42c07c2f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:21:19.302 INFO:tasks.workunit.client.0.vm01.stdout: SEC OPS OPS/SEC BYTES/SEC 2026-03-24T17:21:19.302 
INFO:tasks.workunit.client.0.vm01.stdout: 1 3088 2842.49 11 MiB/s 2026-03-24T17:21:20.265 INFO:tasks.workunit.client.0.vm01.stdout: 2 5104 2495.12 9.7 MiB/s 2026-03-24T17:21:20.359 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:20.354+0000 7fd44f888640 0 -- 192.168.123.101:0/1629552713 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7fd430012c10 msgr2=0x7fd42c1c22b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:21:20.412 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:20.406+0000 7fd44ddfe640 0 -- 192.168.123.101:0/1629552713 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7fd42c05bef0 msgr2=0x7fd42c07c2f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-24T17:21:20.981 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:20.974+0000 7fd44f888640 0 -- 192.168.123.101:0/1629552713 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x5622c30895e0 msgr2=0x7fd42c1c16f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:21:20.991 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:20.986+0000 7fd44f888640 0 -- 192.168.123.101:0/1629552713 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7fd42c05bef0 msgr2=0x7fd42c07c2d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-24T17:21:21.894 INFO:tasks.workunit.client.0.vm01.stdout: 3 6896 1878.26 7.3 MiB/s 2026-03-24T17:21:22.211 INFO:tasks.workunit.client.0.vm01.stdout: 4 7808 1956 7.6 MiB/s 2026-03-24T17:21:22.319 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:22.314+0000 7fd44f888640 0 -- 192.168.123.101:0/1629552713 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x5622c30895e0 msgr2=0x7fd42c08a930 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until 
injecting socket failure 2026-03-24T17:21:22.479 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:22.474+0000 7fd44f888640 0 -- 192.168.123.101:0/1629552713 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7fd430012c10 msgr2=0x7fd42c08ae70 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:21:22.483 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:22.478+0000 7fd44f888640 0 -- 192.168.123.101:0/1629552713 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7fd430012c10 msgr2=0x7fd42c08ae70 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-24T17:21:22.684 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:22.678+0000 7fd44f888640 0 -- 192.168.123.101:0/1629552713 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7fd430012c10 msgr2=0x7fd42c08c2d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:21:23.032 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:23.026+0000 7fd44f888640 0 -- 192.168.123.101:0/1629552713 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7fd430012c10 msgr2=0x7fd42c08ba40 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-24T17:21:23.596 INFO:tasks.workunit.client.0.vm01.stdout:elapsed: 5 ops: 8192 ops/sec: 1520.41 bytes/sec: 5.9 MiB/s 2026-03-24T17:21:23.605 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create rbd/test1/image1@1 2026-03-24T17:21:24.578 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 
2026-03-24T17:21:24.586 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone --rbd-default-clone-format 2 rbd/test1/image1@1 rbd/test2/image1 2026-03-24T17:21:24.637 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap rm rbd/test1/image1@1 2026-03-24T17:21:24.667 INFO:tasks.workunit.client.0.vm01.stderr: Removing snap: 100% complete...done. 2026-03-24T17:21:24.675 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd export rbd/test1/image1 - 2026-03-24T17:21:24.675 INFO:tasks.workunit.client.0.vm01.stderr:+ cmp /dev/fd/63 /dev/fd/62 2026-03-24T17:21:24.675 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd export rbd/test2/image1 - 2026-03-24T17:21:24.822 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 1% complete... Exporting image: 2% complete... Exporting image: 3% complete... Exporting image: 1% complete... Exporting image: 2% complete... Exporting image: 3% complete... Exporting image: 4% complete... Exporting image: 5% complete... Exporting image: 6% complete... Exporting image: 7% complete... Exporting image: 4% complete... Exporting image: 5% complete... Exporting image: 6% complete... Exporting image: 7% complete... Exporting image: 8% complete... Exporting image: 9% complete... Exporting image: 10% complete... Exporting image: 11% complete... Exporting image: 8% complete... Exporting image: 9% complete... Exporting image: 10% complete... Exporting image: 11% complete... Exporting image: 12% complete... Exporting image: 13% complete... Exporting image: 14% complete... Exporting image: 15% complete...2026-03-24T17:21:24.814+0000 7f0cfb895640 0 -- 192.168.123.101:0/3856644584 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55b9cafe30f0 msgr2=0x55b9cb0034d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:21:24.868 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 12% complete... Exporting image: 13% complete... Exporting image: 14% complete... 
Exporting image: 15% complete... Exporting image: 16% complete... Exporting image: 17% complete... Exporting image: 18% complete... Exporting image: 19% complete... Exporting image: 16% complete... Exporting image: 17% complete... Exporting image: 18% complete... Exporting image: 19% complete...2026-03-24T17:21:24.862+0000 7f0cfb895640 0 -- 192.168.123.101:0/3856644584 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7f0cdc05bf20 msgr2=0x7f0cdc07c320 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:21:24.994 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 20% complete... Exporting image: 20% complete... Exporting image: 21% complete... Exporting image: 21% complete... Exporting image: 22% complete... Exporting image: 22% complete... Exporting image: 23% complete... Exporting image: 23% complete... Exporting image: 24% complete... Exporting image: 25% complete... Exporting image: 26% complete... Exporting image: 24% complete... Exporting image: 25% complete... Exporting image: 26% complete... Exporting image: 27% complete... Exporting image: 28% complete... Exporting image: 29% complete... Exporting image: 30% complete... Exporting image: 27% complete...2026-03-24T17:21:24.990+0000 7f27002a3640 0 -- 192.168.123.101:0/2861580887 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55e0415b7c60 msgr2=0x55e0415ed810 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:21:25.070 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 28% complete... Exporting image: 29% complete... Exporting image: 30% complete... Exporting image: 31% complete... Exporting image: 31% complete... Exporting image: 32% complete... Exporting image: 33% complete... Exporting image: 34% complete... Exporting image: 32% complete... Exporting image: 33% complete... Exporting image: 34% complete... Exporting image: 35% complete... 
Exporting image: 35% complete... Exporting image: 36% complete... Exporting image: 37% complete... Exporting image: 38% complete...2026-03-24T17:21:25.066+0000 7f27002a3640 0 -- 192.168.123.101:0/2861580887 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7f26e005bff0 msgr2=0x7f26e007c3f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:21:25.850 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 36% complete... Exporting image: 37% complete... Exporting image: 38% complete... Exporting image: 39% complete... Exporting image: 40% complete... Exporting image: 41% complete... Exporting image: 42% complete... Exporting image: 39% complete... Exporting image: 40% complete... Exporting image: 41% complete... Exporting image: 42% complete... Exporting image: 43% complete... Exporting image: 44% complete... Exporting image: 45% complete... Exporting image: 46% complete... Exporting image: 43% complete... Exporting image: 44% complete... Exporting image: 45% complete... Exporting image: 46% complete... Exporting image: 47% complete... Exporting image: 48% complete... Exporting image: 49% complete... Exporting image: 47% complete... Exporting image: 48% complete... Exporting image: 49% complete... Exporting image: 50% complete... Exporting image: 50% complete... Exporting image: 51% complete... Exporting image: 52% complete... Exporting image: 53% complete... Exporting image: 51% complete... Exporting image: 52% complete... Exporting image: 53% complete... Exporting image: 54% complete... Exporting image: 55% complete... Exporting image: 56% complete... Exporting image: 57% complete... Exporting image: 54% complete... Exporting image: 55% complete... Exporting image: 56% complete... Exporting image: 57% complete... Exporting image: 58% complete... Exporting image: 59% complete... Exporting image: 60% complete... Exporting image: 58% complete... Exporting image: 59% complete... 
Exporting image: 60% complete... Exporting image: 61% complete... Exporting image: 62% complete... Exporting image: 63% complete... Exporting image: 64% complete... Exporting image: 61% complete... Exporting image: 62% complete... Exporting image: 63% complete... Exporting image: 64% complete... Exporting image: 65% complete... Exporting image: 66% complete... Exporting image: 67% complete... Exporting image: 68% complete... Exporting image: 65% complete... Exporting image: 66% complete... Exporting image: 67% complete... Exporting image: 68% complete... Exporting image: 69% complete... Exporting image: 70% complete... Exporting image: 71% complete... Exporting image: 69% complete... Exporting image: 72% complete... Exporting image: 70% complete... Exporting image: 71% complete... Exporting image: 72% complete... Exporting image: 73% complete... Exporting image: 73% complete... Exporting image: 74% complete... Exporting image: 75% complete... Exporting image: 76% complete... Exporting image: 74% complete... Exporting image: 75% complete... Exporting image: 76% complete... Exporting image: 77% complete... Exporting image: 77% complete... Exporting image: 78% complete... Exporting image: 79% complete... Exporting image: 80% complete... Exporting image: 78% complete... Exporting image: 79% complete... Exporting image: 80% complete... Exporting image: 81% complete... Exporting image: 82% complete... Exporting image: 83% complete... Exporting image: 81% complete... Exporting image: 82% complete... Exporting image: 83% complete... Exporting image: 84% complete... Exporting image: 85% complete... Exporting image: 86% complete... Exporting image: 87% complete... Exporting image: 84% complete... Exporting image: 85% complete... Exporting image: 86% complete... Exporting image: 87% complete... Exporting image: 88% complete... Exporting image: 89% complete... Exporting image: 90% complete... Exporting image: 91% complete... Exporting image: 88% complete... 
Exporting image: 89% complete... Exporting image: 90% complete... Exporting image: 91% complete... Exporting image: 92% complete... Exporting image: 93% complete... Exporting image: 94% complete... Exporting image: 92% complete... Exporting image: 93% complete... Exporting image: 94% complete... Exporting image: 95% complete... Exporting image: 95% complete... Exporting image: 96% complete... Exporting image: 97% complete... Exporting image: 98% complete... Exporting image: 96% complete... Exporting image: 97% complete... Exporting image: 98% complete... Exporting image: 99% complete... Exporting image: 99% complete... Exporting image: 100% complete...done.Exporting image: 100% complete...done. 2026-03-24T17:21:25.850 INFO:tasks.workunit.client.0.vm01.stderr: 2026-03-24T17:21:25.857 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm rbd/test2/image1 2026-03-24T17:21:25.928 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 1% complete... Removing image: 2% complete... Removing image: 3% complete... Removing image: 4% complete... Removing image: 5% complete... Removing image: 6% complete... Removing image: 7% complete... Removing image: 8% complete... Removing image: 9% complete... Removing image: 10% complete... Removing image: 11% complete... Removing image: 12% complete... Removing image: 13% complete... Removing image: 14% complete... Removing image: 15% complete... Removing image: 16% complete... Removing image: 17% complete... Removing image: 18% complete... Removing image: 19% complete... Removing image: 20% complete... Removing image: 21% complete... Removing image: 22% complete... Removing image: 23% complete... Removing image: 24% complete... Removing image: 25% complete... Removing image: 26% complete... Removing image: 27% complete... Removing image: 28% complete... Removing image: 29% complete... Removing image: 30% complete... Removing image: 31% complete... Removing image: 32% complete... Removing image: 33% complete... 
Removing image: 34% ... 99% complete...
2026-03-24T17:21:25.922+0000 7f006783b640 0 -- 192.168.123.101:0/2134111377 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7f0040008d30 msgr2=0x7f00400291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:21:25.939 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:25.934+0000 7f006783b640 0 -- 192.168.123.101:0/2134111377 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7f004805bff0 msgr2=0x7f004807c3d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:21:26.604 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:21:26.608 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 1G rbd/image2
2026-03-24T17:21:26.645 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 32M --io-size 4K rbd/image2
2026-03-24T17:21:26.678 INFO:tasks.workunit.client.0.vm01.stdout:bench type write io_size 4096 io_threads 16 bytes 33554432 pattern random
2026-03-24T17:21:26.718 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:26.714+0000 7fe353416640 0 -- 192.168.123.101:0/1929482139 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7fe32c008d30 msgr2=0x7fe32c0291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:21:26.719 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:26.714+0000 7fe353416640 0 -- 192.168.123.101:0/1929482139 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7fe33405bfc0 msgr2=0x7fe33407c3c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T17:21:27.527 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:27.522+0000 7fe35469f640 0 -- 192.168.123.101:0/1929482139 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x5603941438a0 msgr2=0x5603941d3200 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:21:27.623 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:27.618+0000 7fe35469f640 0 -- 192.168.123.101:0/1929482139 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x5603941438a0 msgr2=0x7fe334080ac0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T17:21:27.791 INFO:tasks.workunit.client.0.vm01.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-24T17:21:27.791 INFO:tasks.workunit.client.0.vm01.stdout: 1 2640 2397.11 9.4 MiB/s
2026-03-24T17:21:28.721 INFO:tasks.workunit.client.0.vm01.stdout: 2 4320 2129.66 8.3 MiB/s
2026-03-24T17:21:28.769 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:28.762+0000 7fe353416640 0 -- 192.168.123.101:0/1929482139 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7fe32c008d30 msgr2=0x7fe334080580 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:21:29.282 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:29.278+0000 7fe35469f640 0 -- 192.168.123.101:0/1929482139 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x5603941438a0 msgr2=0x7fe334187750 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:21:29.583 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:29.578+0000 7fe35469f640 0 -- 192.168.123.101:0/1929482139 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7fe33405bfc0 msgr2=0x7fe33407c3a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:21:30.406 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:30.398+0000 7fe35469f640 0 -- 192.168.123.101:0/1929482139 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7fe33405bfc0 msgr2=0x7fe33407c3a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:21:30.419 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:30.414+0000 7fe35469f640 0 -- 192.168.123.101:0/1929482139 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7fe33405bfc0 msgr2=0x7fe33407c3a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T17:21:30.420 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:30.414+0000 7fe35469f640 0 -- 192.168.123.101:0/1929482139 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x5603941438a0 msgr2=0x7fe33411db00 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T17:21:30.521 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:30.514+0000 7fe35469f640 0 -- 192.168.123.101:0/1929482139 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7fe334187750 msgr2=0x7fe33407ca30 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T17:21:30.523 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:30.518+0000 7fe35469f640 0 -- 192.168.123.101:0/1929482139 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7fe33405bfc0 msgr2=0x7fe33407c3a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T17:21:30.527 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:30.522+0000 7fe35469f640 0 -- 192.168.123.101:0/1929482139 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7fe334187750 msgr2=0x7fe33407ca30 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
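The recurring "injecting socket failure" records above are deliberate: this job's config (see the head of the log) sets `ms inject socket failures: 5000`, so the messenger randomly drops an established connection roughly once per 5000 socket operations to exercise client reconnect and retry paths. A back-of-envelope sketch of the expected injection count over a burst of operations (the operation count here is illustrative, not taken from the log):

```shell
# Expected number of injected socket failures for a burst of messenger
# operations, given "ms inject socket failures: 5000" (roughly one
# injection per 5000 ops).
ops=100000                     # illustrative operation count
every=5000                     # value of "ms inject socket failures"
expected=$((ops / every))
echo "$expected"               # prints 20
```

This is why the progress output is interrupted so often: each injected failure forces the client to reconnect mid-operation, which the workunit must survive transparently.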
2026-03-24T17:21:30.578 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:30.574+0000 7fe35469f640 0 -- 192.168.123.101:0/1929482139 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7fe33405bfc0 msgr2=0x7fe33407c3a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:21:30.724 INFO:tasks.workunit.client.0.vm01.stdout: 4 6176 1532.67 6.0 MiB/s
2026-03-24T17:21:31.290 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:31.286+0000 7fe353416640 0 -- 192.168.123.101:0/1929482139 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7fe330004930 msgr2=0x7fe334082d30 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:21:31.486 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:31.482+0000 7fe35469f640 0 -- 192.168.123.101:0/1929482139 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7fe334187750 msgr2=0x7fe33411d6c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:21:31.488 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:31.482+0000 7fe35469f640 0 -- 192.168.123.101:0/1929482139 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7fe330004930 msgr2=0x7fe334082d30 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T17:21:32.176 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:32.170+0000 7fe35469f640 0 -- 192.168.123.101:0/1929482139 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7fe330004930 msgr2=0x7fe334082d30 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:21:32.177 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:32.170+0000 7fe35469f640 0 -- 192.168.123.101:0/1929482139 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7fe334187750 msgr2=0x7fe33411d6c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:21:32.547 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:32.542+0000 7fe35469f640 0 -- 192.168.123.101:0/1929482139 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7fe330004930 msgr2=0x7fe334082d30 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:21:33.507 INFO:tasks.workunit.client.0.vm01.stdout:elapsed: 6 ops: 8192 ops/sec: 1199.76 bytes/sec: 4.7 MiB/s
2026-03-24T17:21:33.517 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create rbd/image2@1
2026-03-24T17:21:34.434 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T17:21:34.444 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone --rbd-default-clone-format 2 rbd/image2@1 rbd/test2/image2
2026-03-24T17:21:34.504 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap rm rbd/image2@1
2026-03-24T17:21:34.537 INFO:tasks.workunit.client.0.vm01.stderr: Removing snap: 100% complete...done.
2026-03-24T17:21:34.546 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd export rbd/image2 -
2026-03-24T17:21:34.546 INFO:tasks.workunit.client.0.vm01.stderr:+ cmp /dev/fd/63 /dev/fd/62
2026-03-24T17:21:34.546 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd export rbd/test2/image2 -
2026-03-24T17:21:34.937 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 1% ... 7% complete...
Exporting image: 8% ... 15% complete...
2026-03-24T17:21:34.930+0000 7f907b5ad640 0 -- 192.168.123.101:0/1349134144 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7f9054008d30 msgr2=0x7f90540291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T17:21:34.980 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 15% ... 19% complete...
2026-03-24T17:21:34.974+0000 7f907d037640 0 -- 192.168.123.101:0/1349134144 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x560e64ad6c60 msgr2=0x560e64ad1e00 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:21:35.194 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 19% ... 34% complete...
2026-03-24T17:21:35.190+0000 7f5bd2c8a640 0 -- 192.168.123.101:0/1022190905 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x5579414d8030 msgr2=0x557941567cc0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T17:21:35.239 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 31% ... 38% complete...
2026-03-24T17:21:35.234+0000 7f5bd2c8a640 0 -- 192.168.123.101:0/1022190905 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7f5bb005bf20 msgr2=0x7f5bb007c320 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:21:36.149 INFO:tasks.workunit.client.0.vm01.stderr: Exporting image: 36% ... 100% complete...done.
2026-03-24T17:21:36.149 INFO:tasks.workunit.client.0.vm01.stderr:Exporting image: 100% complete...done.
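The `++ rbd export ... -` pairs followed by `+ cmp /dev/fd/63 /dev/fd/62` in the trace are bash process substitution at work: the workunit streams the parent image and its clone to two file-descriptor FIFOs and compares them byte for byte, so the clone only passes if its exported contents are identical to the parent's. The same pattern with placeholder data in place of the `rbd export` streams (no cluster assumed):

```shell
# Process-substitution comparison, as in the workunit's
# cmp <(rbd export A -) <(rbd export B -) idiom; placeholder byte
# streams stand in for the two exports here.
if cmp -s <(printf 'image bytes') <(printf 'image bytes'); then
  echo "images match"
else
  echo "images differ" >&2
fi
```

Because both exports run concurrently, their per-percent progress lines interleave on stderr, which is why the raw log shows each percentage twice.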
2026-03-24T17:21:36.159 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd rm rbd/image2
2026-03-24T17:21:36.159 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm rbd/image2
2026-03-24T17:21:36.202 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:36.198+0000 7f36068e8200 -1 librbd::api::Image: remove: image has snapshots - not removing
2026-03-24T17:21:36.202 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 0% complete...failed.
2026-03-24T17:21:36.205 INFO:tasks.workunit.client.0.vm01.stderr:rbd: image has snapshots with linked clones - these must be deleted or flattened before the image can be removed.
2026-03-24T17:21:36.209 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:21:36.209 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm rbd/test2/image2
2026-03-24T17:21:36.280 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:21:36.274+0000 7f6f20884640 0 -- 192.168.123.101:0/1047086825 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7f6f0005bf20 msgr2=0x7f6f0007c320 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:21:36.289 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 1% ... 99% complete...
2026-03-24T17:21:36.282+0000 7f6f2230e640 0 -- 192.168.123.101:0/1047086825 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x559123ca4520 msgr2=0x559123c90d50 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:21:36.459 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:21:36.464 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm rbd/image2
2026-03-24T17:21:36.543 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 1% ... 5% complete...
2026-03-24T17:21:36.538+0000 7f203ffc4640 0 -- 192.168.123.101:0/3588149264 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x5578a3cbcdc0 msgr2=0x5578a3da09d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:21:36.576 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 6% ... 12% complete...
Removing image: 13% complete...
2026-03-24T17:21:36.570+0000 7f203ed3b640 0 -- 192.168.123.101:0/3588149264 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x5578a3dfcb10 msgr2=0x5578a3e1cf90 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:21:36.900 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 14% ... 100% complete...done.
2026-03-24T17:21:36.904 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 1G rbd/test1/image3
2026-03-24T17:21:36.940 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create rbd/test1/image3@1
2026-03-24T17:21:37.451 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T17:21:37.461 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap protect rbd/test1/image3@1
2026-03-24T17:21:37.498 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone --rbd-default-clone-format 1 rbd/test1/image3@1 rbd/test1/image4
2026-03-24T17:21:37.556 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm rbd/test1/image4
2026-03-24T17:21:37.634 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 1% complete...
Removing image: 2% complete... Removing image: 3% complete... Removing image: 4% complete... Removing image: 5% complete... Removing image: 6% complete... Removing image: 7% complete... Removing image: 8% complete... Removing image: 9% complete... Removing image: 10% complete... Removing image: 11% complete... Removing image: 12% complete... Removing image: 13% complete... Removing image: 14% complete... Removing image: 15% complete... Removing image: 16% complete... Removing image: 17% complete... Removing image: 18% complete... Removing image: 19% complete... Removing image: 20% complete... Removing image: 21% complete... Removing image: 22% complete... Removing image: 23% complete... Removing image: 24% complete... Removing image: 25% complete... Removing image: 26% complete... Removing image: 27% complete... Removing image: 28% complete... Removing image: 29% complete... Removing image: 30% complete... Removing image: 31% complete... Removing image: 32% complete... Removing image: 33% complete... Removing image: 34% complete... Removing image: 35% complete... Removing image: 36% complete... Removing image: 37% complete... Removing image: 38% complete... Removing image: 39% complete... Removing image: 40% complete... Removing image: 41% complete... Removing image: 42% complete... Removing image: 43% complete... Removing image: 44% complete... Removing image: 45% complete... Removing image: 46% complete... Removing image: 47% complete... Removing image: 48% complete... Removing image: 49% complete... Removing image: 50% complete... Removing image: 51% complete... Removing image: 52% complete... Removing image: 53% complete... Removing image: 54% complete... Removing image: 55% complete... Removing image: 56% complete... Removing image: 57% complete... Removing image: 58% complete... Removing image: 59% complete... Removing image: 60% complete... Removing image: 61% complete... Removing image: 62% complete... Removing image: 63% complete... 
Removing image: 64% complete... Removing image: 65% complete... Removing image: 66% complete... Removing image: 67% complete... Removing image: 68% complete... Removing image: 69% complete... Removing image: 70% complete... Removing image: 71% complete... Removing image: 72% complete... Removing image: 73% complete... Removing image: 74% complete... Removing image: 75% complete... Removing image: 76% complete... Removing image: 77% complete... Removing image: 78% complete... Removing image: 79% complete... Removing image: 80% complete... Removing image: 81% complete... Removing image: 82% complete... Removing image: 83% complete... Removing image: 84% complete... Removing image: 85% complete... Removing image: 86% complete... Removing image: 87% complete... Removing image: 88% complete... Removing image: 89% complete... Removing image: 90% complete... Removing image: 91% complete... Removing image: 92% complete... Removing image: 93% complete... Removing image: 94% complete... Removing image: 95% complete... Removing image: 96% complete... Removing image: 97% complete... Removing image: 98% complete... Removing image: 99% complete...2026-03-24T17:21:37.630+0000 7f7234dc9640 0 -- 192.168.123.101:0/3137351183 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f721405bf40 msgr2=0x7f721407c340 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:21:37.644 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 2026-03-24T17:21:37.648 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap unprotect rbd/test1/image3@1 2026-03-24T17:21:37.686 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap rm rbd/test1/image3@1 2026-03-24T17:21:38.444 INFO:tasks.workunit.client.0.vm01.stderr: Removing snap: 100% complete...done. 
2026-03-24T17:21:38.452 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm rbd/test1/image3 2026-03-24T17:21:38.526 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 1% complete... Removing image: 2% complete... Removing image: 3% complete... Removing image: 4% complete... Removing image: 5% complete... Removing image: 6% complete... Removing image: 7% complete... Removing image: 8% complete... Removing image: 9% complete... Removing image: 10% complete... Removing image: 11% complete... Removing image: 12% complete... Removing image: 13% complete... Removing image: 14% complete... Removing image: 15% complete... Removing image: 16% complete... Removing image: 17% complete... Removing image: 18% complete... Removing image: 19% complete... Removing image: 20% complete... Removing image: 21% complete... Removing image: 22% complete... Removing image: 23% complete... Removing image: 24% complete... Removing image: 25% complete... Removing image: 26% complete... Removing image: 27% complete... Removing image: 28% complete... Removing image: 29% complete... Removing image: 30% complete... Removing image: 31% complete... Removing image: 32% complete... Removing image: 33% complete... Removing image: 34% complete... Removing image: 35% complete... Removing image: 36% complete... Removing image: 37% complete... Removing image: 38% complete... Removing image: 39% complete... Removing image: 40% complete... Removing image: 41% complete... Removing image: 42% complete... Removing image: 43% complete... Removing image: 44% complete... Removing image: 45% complete... Removing image: 46% complete... Removing image: 47% complete... Removing image: 48% complete... Removing image: 49% complete... Removing image: 50% complete... Removing image: 51% complete... Removing image: 52% complete... Removing image: 53% complete... Removing image: 54% complete... Removing image: 55% complete... Removing image: 56% complete... Removing image: 57% complete... 
Removing image: 58% complete... Removing image: 59% complete... Removing image: 60% complete... Removing image: 61% complete... Removing image: 62% complete... Removing image: 63% complete... Removing image: 64% complete... Removing image: 65% complete... Removing image: 66% complete... Removing image: 67% complete... Removing image: 68% complete... Removing image: 69% complete... Removing image: 70% complete... Removing image: 71% complete... Removing image: 72% complete... Removing image: 73% complete... Removing image: 74% complete... Removing image: 75% complete... Removing image: 76% complete... Removing image: 77% complete... Removing image: 78% complete... Removing image: 79% complete... Removing image: 80% complete... Removing image: 81% complete... Removing image: 82% complete... Removing image: 83% complete... Removing image: 84% complete... Removing image: 85% complete... Removing image: 86% complete... Removing image: 87% complete... Removing image: 88% complete... Removing image: 89% complete... Removing image: 90% complete... Removing image: 91% complete... Removing image: 92% complete... Removing image: 93% complete... Removing image: 94% complete... Removing image: 95% complete... Removing image: 96% complete... Removing image: 97% complete... Removing image: 98% complete... Removing image: 99% complete...2026-03-24T17:21:38.522+0000 7efc23fff640 0 -- 192.168.123.101:0/3903025207 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7efc00008d30 msgr2=0x7efc000291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:21:38.532 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 
2026-03-24T17:21:38.536 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 1G --namespace test1 image2 2026-03-24T17:21:38.578 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd namespace remove rbd/test1 2026-03-24T17:21:38.578 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd namespace remove rbd/test1 2026-03-24T17:21:38.604 INFO:tasks.workunit.client.0.vm01.stderr:rbd: namespace contains images which must be deleted first. 2026-03-24T17:21:38.607 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-24T17:21:38.607 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd group create rbd/test1/group1 2026-03-24T17:21:38.638 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd group image add rbd/test1/group1 rbd/test1/image1 2026-03-24T17:21:38.882 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd group image add --group-pool rbd --group-namespace test1 --group group1 --image-pool rbd --image-namespace test1 --image image2 2026-03-24T17:21:38.926 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd group image rm --group-pool rbd --group-namespace test1 --group group1 --image-pool rbd --image-namespace test1 --image image1 2026-03-24T17:21:38.971 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd group image rm rbd/test1/group1 rbd/test1/image2 2026-03-24T17:21:39.008 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd group rm rbd/test1/group1 2026-03-24T17:21:39.038 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash move rbd/test1/image1 2026-03-24T17:21:39.091 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd trash --namespace test1 ls 2026-03-24T17:21:39.092 INFO:tasks.workunit.client.0.vm01.stderr:++ cut -d ' ' -f 1 2026-03-24T17:21:39.115 INFO:tasks.workunit.client.0.vm01.stderr:+ ID=292be3abfb0 2026-03-24T17:21:39.115 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash rm rbd/test1/292be3abfb0 2026-03-24T17:21:39.254 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 1% complete... Removing image: 2% complete... Removing image: 3% complete... 
Removing image: 4% complete... Removing image: 5% complete... Removing image: 6% complete... Removing image: 7% complete... Removing image: 8% complete... Removing image: 9% complete... Removing image: 10% complete... Removing image: 11% complete... Removing image: 12% complete... Removing image: 13% complete... Removing image: 14% complete... Removing image: 15% complete... Removing image: 16% complete... Removing image: 17% complete... Removing image: 18% complete... Removing image: 19% complete... Removing image: 20% complete... Removing image: 21% complete... Removing image: 22% complete... Removing image: 23% complete... Removing image: 24% complete... Removing image: 25% complete... Removing image: 26% complete... Removing image: 27% complete... Removing image: 28% complete... Removing image: 29% complete... Removing image: 30% complete... Removing image: 31% complete... Removing image: 32% complete... Removing image: 33% complete...2026-03-24T17:21:39.250+0000 7f6bfdcfb640 0 -- 192.168.123.101:0/2584382547 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x5630e6bf5fc0 msgr2=0x5630e6cda210 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:21:39.278 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 34% complete... Removing image: 35% complete... Removing image: 36% complete... Removing image: 37% complete... Removing image: 38% complete... Removing image: 39% complete... Removing image: 40% complete... Removing image: 41% complete...2026-03-24T17:21:39.274+0000 7f6bfdcfb640 0 -- 192.168.123.101:0/2584382547 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7f6bdc05bfd0 msgr2=0x7f6bdc07c3d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:21:39.535 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 42% complete... Removing image: 43% complete... Removing image: 44% complete... 
Removing image: 45% complete... Removing image: 46% complete... Removing image: 47% complete... Removing image: 48% complete... Removing image: 49% complete... Removing image: 50% complete... Removing image: 51% complete... Removing image: 52% complete... Removing image: 53% complete... Removing image: 54% complete... Removing image: 55% complete... Removing image: 56% complete... Removing image: 57% complete... Removing image: 58% complete... Removing image: 59% complete... Removing image: 60% complete... Removing image: 61% complete... Removing image: 62% complete... Removing image: 63% complete... Removing image: 64% complete... Removing image: 65% complete... Removing image: 66% complete... Removing image: 67% complete... Removing image: 68% complete... Removing image: 69% complete... Removing image: 70% complete... Removing image: 71% complete... Removing image: 72% complete... Removing image: 73% complete... Removing image: 74% complete... Removing image: 75% complete... Removing image: 76% complete... Removing image: 77% complete... Removing image: 78% complete... Removing image: 79% complete... Removing image: 80% complete... Removing image: 81% complete... Removing image: 82% complete... Removing image: 83% complete... Removing image: 84% complete... Removing image: 85% complete... Removing image: 86% complete... Removing image: 87% complete... Removing image: 88% complete... Removing image: 89% complete... Removing image: 90% complete... Removing image: 91% complete... Removing image: 92% complete... Removing image: 93% complete... Removing image: 94% complete... Removing image: 95% complete... Removing image: 96% complete... Removing image: 97% complete... Removing image: 98% complete... Removing image: 99% complete... Removing image: 100% complete...done. 2026-03-24T17:21:39.539 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd remove rbd/test1/image2 2026-03-24T17:21:39.617 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 1% complete... 
Removing image: 2% complete... Removing image: 3% complete... Removing image: 4% complete... Removing image: 5% complete... Removing image: 6% complete... Removing image: 7% complete... Removing image: 8% complete... Removing image: 9% complete... Removing image: 10% complete... Removing image: 11% complete... Removing image: 12% complete... Removing image: 13% complete... Removing image: 14% complete... Removing image: 15% complete... Removing image: 16% complete... Removing image: 17% complete... Removing image: 18% complete... Removing image: 19% complete... Removing image: 20% complete... Removing image: 21% complete... Removing image: 22% complete... Removing image: 23% complete... Removing image: 24% complete... Removing image: 25% complete... Removing image: 26% complete... Removing image: 27% complete... Removing image: 28% complete... Removing image: 29% complete... Removing image: 30% complete... Removing image: 31% complete... Removing image: 32% complete... Removing image: 33% complete... Removing image: 34% complete... Removing image: 35% complete... Removing image: 36% complete... Removing image: 37% complete... Removing image: 38% complete... Removing image: 39% complete... Removing image: 40% complete... Removing image: 41% complete... Removing image: 42% complete... Removing image: 43% complete... Removing image: 44% complete... Removing image: 45% complete... Removing image: 46% complete... Removing image: 47% complete... Removing image: 48% complete... Removing image: 49% complete... Removing image: 50% complete... Removing image: 51% complete... Removing image: 52% complete... Removing image: 53% complete... Removing image: 54% complete... Removing image: 55% complete... Removing image: 56% complete... Removing image: 57% complete... Removing image: 58% complete... Removing image: 59% complete... Removing image: 60% complete... Removing image: 61% complete... Removing image: 62% complete... Removing image: 63% complete... 
Removing image: 64% complete... Removing image: 65% complete... Removing image: 66% complete... Removing image: 67% complete... Removing image: 68% complete... Removing image: 69% complete... Removing image: 70% complete... Removing image: 71% complete... Removing image: 72% complete... Removing image: 73% complete... Removing image: 74% complete... Removing image: 75% complete... Removing image: 76% complete... Removing image: 77% complete... Removing image: 78% complete... Removing image: 79% complete... Removing image: 80% complete... Removing image: 81% complete... Removing image: 82% complete... Removing image: 83% complete... Removing image: 84% complete... Removing image: 85% complete... Removing image: 86% complete... Removing image: 87% complete... Removing image: 88% complete... Removing image: 89% complete... Removing image: 90% complete... Removing image: 91% complete... Removing image: 92% complete... Removing image: 93% complete... Removing image: 94% complete... Removing image: 95% complete... Removing image: 96% complete... Removing image: 97% complete... Removing image: 98% complete... Removing image: 99% complete...2026-03-24T17:21:39.610+0000 7f594e0de640 0 -- 192.168.123.101:0/3661046127 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x5569264aa4e0 msgr2=0x556926484e00 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:21:39.625 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done. 
2026-03-24T17:21:39.628 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd namespace remove --pool rbd --namespace test1 2026-03-24T17:21:39.681 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd namespace remove --namespace test3 2026-03-24T17:21:39.740 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd namespace list 2026-03-24T17:21:39.740 INFO:tasks.workunit.client.0.vm01.stderr:+ grep test 2026-03-24T17:21:39.740 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^1$' 2026-03-24T17:21:39.741 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:21:39.764 INFO:tasks.workunit.client.0.vm01.stdout:1 2026-03-24T17:21:39.764 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd namespace remove rbd/test2 2026-03-24T17:21:39.821 INFO:tasks.workunit.client.0.vm01.stderr:+ test_trash_purge_schedule 2026-03-24T17:21:39.821 INFO:tasks.workunit.client.0.vm01.stdout:testing trash purge schedule... 2026-03-24T17:21:39.822 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'testing trash purge schedule...' 2026-03-24T17:21:39.822 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images 2026-03-24T17:21:39.822 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:39.898 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:40.179 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:40.258 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:40.334 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:40.411 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:40.489 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:40.565 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:40.843 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:40.922 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:41.000 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in 
$IMGS 2026-03-24T17:21:41.280 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:41.360 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:41.437 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:41.516 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:41.590 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:41.668 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:41.745 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:21:41.822 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd pool create rbd2 8 2026-03-24T17:21:42.216 INFO:tasks.workunit.client.0.vm01.stderr:pool 'rbd2' already exists 2026-03-24T17:21:42.228 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd pool init rbd2 2026-03-24T17:21:45.183 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd namespace create rbd2/ns1 2026-03-24T17:21:45.212 INFO:tasks.workunit.client.0.vm01.stderr:++ ceph rbd trash purge schedule list 2026-03-24T17:21:45.464 INFO:tasks.workunit.client.0.vm01.stderr:+ test '{}' = '{}' 2026-03-24T17:21:45.464 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph rbd trash purge schedule status 2026-03-24T17:21:45.464 INFO:tasks.workunit.client.0.vm01.stderr:+ fgrep '"scheduled": []' 2026-03-24T17:21:45.716 INFO:tasks.workunit.client.0.vm01.stdout: "scheduled": [] 2026-03-24T17:21:45.716 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd trash purge schedule ls 2026-03-24T17:21:45.716 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule ls 2026-03-24T17:21:45.742 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-24T17:21:45.742 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd trash purge schedule ls -R --format json 2026-03-24T17:21:45.766 INFO:tasks.workunit.client.0.vm01.stderr:+ test '[]' = '[]' 2026-03-24T17:21:45.767 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd trash purge 
schedule remove dummy 2026-03-24T17:21:45.767 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule remove dummy 2026-03-24T17:21:45.787 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:21:45.782+0000 7f1bd5681640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy) 2026-03-24T17:21:45.788 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd trash purge schedule remove failed: (22) Invalid argument: Invalid interval (dummy) 2026-03-24T17:21:45.790 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-24T17:21:45.790 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd trash purge schedule remove 1d dummy 2026-03-24T17:21:45.790 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule remove 1d dummy 2026-03-24T17:21:45.811 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:21:45.806+0000 7f1bd5681640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy 2026-03-24T17:21:45.812 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd trash purge schedule remove failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy 2026-03-24T17:21:45.814 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-24T17:21:45.814 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd trash purge schedule remove -p rbd dummy 2026-03-24T17:21:45.814 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule remove -p rbd dummy 2026-03-24T17:21:45.836 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:21:45.830+0000 7f1bd5681640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy) 2026-03-24T17:21:45.837 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd trash purge schedule remove failed: (22) Invalid argument: Invalid interval (dummy) 2026-03-24T17:21:45.839 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-24T17:21:45.839 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd trash purge schedule remove -p rbd 1d dummy 
2026-03-24T17:21:45.839 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule remove -p rbd 1d dummy 2026-03-24T17:21:45.862 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:21:45.858+0000 7f1bd5681640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy 2026-03-24T17:21:45.862 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd trash purge schedule remove failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy 2026-03-24T17:21:45.865 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-24T17:21:45.865 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule add -p rbd 1d 01:30 2026-03-24T17:21:45.894 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule ls -p rbd 2026-03-24T17:21:45.894 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'every 1d starting at 01:30' 2026-03-24T17:21:45.918 INFO:tasks.workunit.client.0.vm01.stdout:every 1d starting at 01:30:00 2026-03-24T17:21:45.918 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd trash purge schedule ls 2026-03-24T17:21:45.918 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule ls 2026-03-24T17:21:45.944 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-24T17:21:45.944 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule ls -R 2026-03-24T17:21:45.944 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'every 1d starting at 01:30' 2026-03-24T17:21:45.968 INFO:tasks.workunit.client.0.vm01.stdout:rbd - every 1d starting at 01:30:00 2026-03-24T17:21:45.968 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule ls -R -p rbd 2026-03-24T17:21:45.968 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'every 1d starting at 01:30' 2026-03-24T17:21:45.995 INFO:tasks.workunit.client.0.vm01.stdout:rbd - every 1d starting at 01:30:00 2026-03-24T17:21:45.995 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd trash purge schedule ls -p rbd2 
2026-03-24T17:21:45.995 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule ls -p rbd2 2026-03-24T17:21:46.021 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-24T17:21:46.021 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd trash purge schedule ls -p rbd2 -R --format json 2026-03-24T17:21:46.047 INFO:tasks.workunit.client.0.vm01.stderr:+ test '[]' = '[]' 2026-03-24T17:21:46.047 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule add -p rbd2/ns1 2d 2026-03-24T17:21:46.074 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd trash purge schedule ls -p rbd2 -R --format json 2026-03-24T17:21:46.100 INFO:tasks.workunit.client.0.vm01.stderr:+ test '[{"pool":"rbd2","namespace":"ns1","items":[{"interval":"2d","start_time":""}]}]' '!=' '[]' 2026-03-24T17:21:46.100 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule ls -p rbd2 -R 2026-03-24T17:21:46.100 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'rbd2 *ns1 *every 2d' 2026-03-24T17:21:46.126 INFO:tasks.workunit.client.0.vm01.stdout:rbd2 ns1 every 2d 2026-03-24T17:21:46.127 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule rm -p rbd2/ns1 2026-03-24T17:21:46.153 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd trash purge schedule ls -p rbd2 -R --format json 2026-03-24T17:21:46.182 INFO:tasks.workunit.client.0.vm01.stderr:+ test '[]' = '[]' 2026-03-24T17:21:46.182 INFO:tasks.workunit.client.0.vm01.stderr:++ seq 12 2026-03-24T17:21:46.183 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12` 2026-03-24T17:21:46.183 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd trash purge schedule status --format xml 2026-03-24T17:21:46.183 INFO:tasks.workunit.client.0.vm01.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-24T17:21:46.211 INFO:tasks.workunit.client.0.vm01.stderr:+ test '' = rbd 2026-03-24T17:21:46.211 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10 2026-03-24T17:21:56.213 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 
12` 2026-03-24T17:21:56.213 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd trash purge schedule status --format xml 2026-03-24T17:21:56.213 INFO:tasks.workunit.client.0.vm01.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-24T17:21:56.238 INFO:tasks.workunit.client.0.vm01.stderr:+ test rbd = rbd 2026-03-24T17:21:56.238 INFO:tasks.workunit.client.0.vm01.stderr:+ break 2026-03-24T17:21:56.238 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule status 2026-03-24T17:21:56.259 INFO:tasks.workunit.client.0.vm01.stdout:POOL NAMESPACE SCHEDULE TIME 2026-03-24T17:21:56.259 INFO:tasks.workunit.client.0.vm01.stdout:rbd 2026-03-25 01:30:00 2026-03-24T17:21:56.262 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd trash purge schedule status --format xml 2026-03-24T17:21:56.262 INFO:tasks.workunit.client.0.vm01.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-24T17:21:56.287 INFO:tasks.workunit.client.0.vm01.stderr:+ test rbd = rbd 2026-03-24T17:21:56.287 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd trash purge schedule status -p rbd --format xml 2026-03-24T17:21:56.287 INFO:tasks.workunit.client.0.vm01.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-24T17:21:56.313 INFO:tasks.workunit.client.0.vm01.stderr:+ test rbd = rbd 2026-03-24T17:21:56.313 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule add 2d 00:17 2026-03-24T17:21:56.343 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule ls 2026-03-24T17:21:56.343 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'every 2d starting at 00:17' 2026-03-24T17:21:56.366 INFO:tasks.workunit.client.0.vm01.stdout:every 2d starting at 00:17:00 2026-03-24T17:21:56.366 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule ls -R 2026-03-24T17:21:56.367 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'every 2d starting at 00:17' 2026-03-24T17:21:56.391 INFO:tasks.workunit.client.0.vm01.stdout:- - every 2d starting at 00:17:00 
2026-03-24T17:21:56.391 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd trash purge schedule ls -p rbd2 2026-03-24T17:21:56.391 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule ls -p rbd2 2026-03-24T17:21:56.415 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-24T17:21:56.415 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule ls -p rbd2 -R 2026-03-24T17:21:56.415 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'every 2d starting at 00:17' 2026-03-24T17:21:56.440 INFO:tasks.workunit.client.0.vm01.stdout:- - every 2d starting at 00:17:00 2026-03-24T17:21:56.440 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule ls -p rbd2/ns1 -R 2026-03-24T17:21:56.440 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'every 2d starting at 00:17' 2026-03-24T17:21:56.465 INFO:tasks.workunit.client.0.vm01.stdout:- - every 2d starting at 00:17:00 2026-03-24T17:21:56.466 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd trash purge schedule ls -R -p rbd2/ns1 --format xml 2026-03-24T17:21:56.466 INFO:tasks.workunit.client.0.vm01.stderr:++ xmlstarlet sel -t -v //schedules/schedule/pool 2026-03-24T17:21:56.493 INFO:tasks.workunit.client.0.vm01.stderr:+ test - = - 2026-03-24T17:21:56.493 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd trash purge schedule ls -R -p rbd2/ns1 --format xml 2026-03-24T17:21:56.493 INFO:tasks.workunit.client.0.vm01.stderr:++ xmlstarlet sel -t -v //schedules/schedule/namespace 2026-03-24T17:21:56.520 INFO:tasks.workunit.client.0.vm01.stderr:+ test - = - 2026-03-24T17:21:56.520 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd trash purge schedule ls -R -p rbd2/ns1 --format xml 2026-03-24T17:21:56.520 INFO:tasks.workunit.client.0.vm01.stderr:++ xmlstarlet sel -t -v //schedules/schedule/items/item/start_time 2026-03-24T17:21:56.547 INFO:tasks.workunit.client.0.vm01.stderr:+ test 00:17:00 = 00:17:00 2026-03-24T17:21:56.547 INFO:tasks.workunit.client.0.vm01.stderr:++ seq 12 2026-03-24T17:21:56.547 
INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12` 2026-03-24T17:21:56.547 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule status --format xml 2026-03-24T17:21:56.548 INFO:tasks.workunit.client.0.vm01.stderr:+ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-24T17:21:56.548 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2 2026-03-24T17:21:56.572 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10 2026-03-24T17:22:06.573 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12` 2026-03-24T17:22:06.573 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule status --format xml 2026-03-24T17:22:06.573 INFO:tasks.workunit.client.0.vm01.stderr:+ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-24T17:22:06.573 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2 2026-03-24T17:22:06.599 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10 2026-03-24T17:22:16.600 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12` 2026-03-24T17:22:16.600 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule status --format xml 2026-03-24T17:22:16.600 INFO:tasks.workunit.client.0.vm01.stderr:+ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-24T17:22:16.600 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2 2026-03-24T17:22:16.624 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10 2026-03-24T17:22:26.626 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12` 2026-03-24T17:22:26.626 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule status --format xml 2026-03-24T17:22:26.626 INFO:tasks.workunit.client.0.vm01.stderr:+ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-24T17:22:26.626 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2 2026-03-24T17:22:26.651 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10 2026-03-24T17:22:36.653 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12` 2026-03-24T17:22:36.653 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule status 
--format xml 2026-03-24T17:22:36.653 INFO:tasks.workunit.client.0.vm01.stderr:+ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-24T17:22:36.653 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2 2026-03-24T17:22:36.679 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10 2026-03-24T17:22:46.681 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12` 2026-03-24T17:22:46.681 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule status --format xml 2026-03-24T17:22:46.681 INFO:tasks.workunit.client.0.vm01.stderr:+ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-24T17:22:46.681 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2 2026-03-24T17:22:46.707 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10 2026-03-24T17:22:56.708 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12` 2026-03-24T17:22:56.708 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule status --format xml 2026-03-24T17:22:56.708 INFO:tasks.workunit.client.0.vm01.stderr:+ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-24T17:22:56.708 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2 2026-03-24T17:22:56.734 INFO:tasks.workunit.client.0.vm01.stdout:rbd2 2026-03-24T17:22:56.734 INFO:tasks.workunit.client.0.vm01.stdout:rbd2 2026-03-24T17:22:56.734 INFO:tasks.workunit.client.0.vm01.stderr:+ break 2026-03-24T17:22:56.734 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule status 2026-03-24T17:22:56.753 INFO:tasks.workunit.client.0.vm01.stdout:POOL NAMESPACE SCHEDULE TIME 2026-03-24T17:22:56.753 INFO:tasks.workunit.client.0.vm01.stdout:rbd 2026-03-25 01:30:00 2026-03-24T17:22:56.753 INFO:tasks.workunit.client.0.vm01.stdout:rbd2 2026-03-26 00:17:00 2026-03-24T17:22:56.753 INFO:tasks.workunit.client.0.vm01.stdout:rbd2 ns1 2026-03-26 00:17:00 2026-03-24T17:22:56.756 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule status --format xml 2026-03-24T17:22:56.756 INFO:tasks.workunit.client.0.vm01.stderr:+ xmlstarlet sel -t -v 
//scheduled/item/pool 2026-03-24T17:22:56.756 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2 2026-03-24T17:22:56.781 INFO:tasks.workunit.client.0.vm01.stdout:rbd2 2026-03-24T17:22:56.781 INFO:tasks.workunit.client.0.vm01.stdout:rbd2 2026-03-24T17:22:56.782 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'rbd rbd2 rbd2' 2026-03-24T17:22:56.782 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd trash purge schedule status --format xml 2026-03-24T17:22:56.782 INFO:tasks.workunit.client.0.vm01.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-24T17:22:56.808 INFO:tasks.workunit.client.0.vm01.stdout:rbd rbd2 rbd2 2026-03-24T17:22:56.808 INFO:tasks.workunit.client.0.vm01.stderr:+ echo rbd rbd2 rbd2 2026-03-24T17:22:56.809 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd trash purge schedule status -p rbd --format xml 2026-03-24T17:22:56.809 INFO:tasks.workunit.client.0.vm01.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-24T17:22:56.835 INFO:tasks.workunit.client.0.vm01.stderr:+ test rbd = rbd 2026-03-24T17:22:56.836 INFO:tasks.workunit.client.0.vm01.stderr:+++ rbd trash purge schedule status -p rbd2 --format xml 2026-03-24T17:22:56.836 INFO:tasks.workunit.client.0.vm01.stderr:+++ xmlstarlet sel -t -v //scheduled/item/pool 2026-03-24T17:22:56.863 INFO:tasks.workunit.client.0.vm01.stderr:++ echo rbd2 rbd2 2026-03-24T17:22:56.863 INFO:tasks.workunit.client.0.vm01.stderr:+ test 'rbd2 rbd2' = 'rbd2 rbd2' 2026-03-24T17:22:56.864 INFO:tasks.workunit.client.0.vm01.stderr:+++ rbd trash purge schedule ls -R --format xml 2026-03-24T17:22:56.864 INFO:tasks.workunit.client.0.vm01.stderr:+++ xmlstarlet sel -t -v //schedules/schedule/items 2026-03-24T17:22:56.890 INFO:tasks.workunit.client.0.vm01.stderr:++ echo 2d00:17:00 1d01:30:00 2026-03-24T17:22:56.890 INFO:tasks.workunit.client.0.vm01.stderr:+ test '2d00:17:00 1d01:30:00' = '2d00:17:00 1d01:30:00' 2026-03-24T17:22:56.890 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule add 1d 
2026-03-24T17:22:56.920 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule ls 2026-03-24T17:22:56.920 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'every 2d starting at 00:17' 2026-03-24T17:22:56.944 INFO:tasks.workunit.client.0.vm01.stdout:every 1d, every 2d starting at 00:17:00 2026-03-24T17:22:56.944 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule ls 2026-03-24T17:22:56.944 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'every 1d' 2026-03-24T17:22:56.967 INFO:tasks.workunit.client.0.vm01.stdout:every 1d, every 2d starting at 00:17:00 2026-03-24T17:22:56.968 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule ls -R --format xml 2026-03-24T17:22:56.968 INFO:tasks.workunit.client.0.vm01.stderr:+ xmlstarlet sel -t -v //schedules/schedule/items 2026-03-24T17:22:56.968 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 2d00:17 2026-03-24T17:22:56.994 INFO:tasks.workunit.client.0.vm01.stdout:1d2d00:17:00 2026-03-24T17:22:56.995 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule rm 1d 2026-03-24T17:22:57.024 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule ls 2026-03-24T17:22:57.024 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'every 2d starting at 00:17' 2026-03-24T17:22:57.050 INFO:tasks.workunit.client.0.vm01.stdout:every 2d starting at 00:17:00 2026-03-24T17:22:57.050 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule rm 2d 00:17 2026-03-24T17:22:57.078 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd trash purge schedule ls 2026-03-24T17:22:57.079 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule ls 2026-03-24T17:22:57.105 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-24T17:22:57.105 INFO:tasks.workunit.client.0.vm01.stderr:+ for p in rbd2 rbd2/ns1 2026-03-24T17:22:57.105 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 -s 1 rbd2/ns1/test1 2026-03-24T17:22:57.137 
INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv rbd2/ns1/test1 2026-03-24T17:22:57.184 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls rbd2/ns1 2026-03-24T17:22:57.184 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:22:57.184 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^1$' 2026-03-24T17:22:57.208 INFO:tasks.workunit.client.0.vm01.stdout:1 2026-03-24T17:22:57.209 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule add -p rbd2 1m 2026-03-24T17:22:57.237 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule list -p rbd2 -R 2026-03-24T17:22:57.237 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'every 1m' 2026-03-24T17:22:57.262 INFO:tasks.workunit.client.0.vm01.stdout:rbd2 - every 1m 2026-03-24T17:22:57.263 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule list -p rbd2/ns1 -R 2026-03-24T17:22:57.263 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'every 1m' 2026-03-24T17:22:57.289 INFO:tasks.workunit.client.0.vm01.stdout:rbd2 - every 1m 2026-03-24T17:22:57.289 INFO:tasks.workunit.client.0.vm01.stderr:++ seq 12 2026-03-24T17:22:57.290 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12` 2026-03-24T17:22:57.290 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls rbd2/ns1 2026-03-24T17:22:57.290 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:22:57.290 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^1$' 2026-03-24T17:22:57.313 INFO:tasks.workunit.client.0.vm01.stdout:1 2026-03-24T17:22:57.313 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10 2026-03-24T17:23:07.314 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12` 2026-03-24T17:23:07.314 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls rbd2/ns1 2026-03-24T17:23:07.314 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:23:07.315 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^1$' 2026-03-24T17:23:07.339 INFO:tasks.workunit.client.0.vm01.stderr:+ break 2026-03-24T17:23:07.339 
INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls rbd2/ns1 2026-03-24T17:23:07.339 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:23:07.339 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^0$' 2026-03-24T17:23:07.362 INFO:tasks.workunit.client.0.vm01.stdout:0 2026-03-24T17:23:07.362 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule list -p rbd2 -R 2026-03-24T17:23:07.362 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'every 1m' 2026-03-24T17:23:07.388 INFO:tasks.workunit.client.0.vm01.stdout:rbd2 - every 1m 2026-03-24T17:23:07.388 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule list -p rbd2/ns1 -R 2026-03-24T17:23:07.388 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'every 1m' 2026-03-24T17:23:07.413 INFO:tasks.workunit.client.0.vm01.stdout:rbd2 - every 1m 2026-03-24T17:23:07.413 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule status 2026-03-24T17:23:07.413 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'rbd2 *ns1' 2026-03-24T17:23:07.436 INFO:tasks.workunit.client.0.vm01.stdout:rbd2 ns1 2026-03-24 17:24:00 2026-03-24T17:23:07.437 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule status -p rbd2 2026-03-24T17:23:07.437 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'rbd2 *ns1' 2026-03-24T17:23:07.462 INFO:tasks.workunit.client.0.vm01.stdout:rbd2 ns1 2026-03-24 17:24:00 2026-03-24T17:23:07.463 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule status -p rbd2/ns1 2026-03-24T17:23:07.463 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'rbd2 *ns1' 2026-03-24T17:23:07.487 INFO:tasks.workunit.client.0.vm01.stdout:rbd2 ns1 2026-03-24 17:24:00 2026-03-24T17:23:07.487 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule rm -p rbd2 1m 2026-03-24T17:23:07.512 INFO:tasks.workunit.client.0.vm01.stderr:+ for p in rbd2 rbd2/ns1 2026-03-24T17:23:07.512 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 -s 1 rbd2/ns1/test1 
2026-03-24T17:23:07.541 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash mv rbd2/ns1/test1 2026-03-24T17:23:07.576 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls rbd2/ns1 2026-03-24T17:23:07.576 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:23:07.576 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^1$' 2026-03-24T17:23:07.599 INFO:tasks.workunit.client.0.vm01.stdout:1 2026-03-24T17:23:07.599 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule add -p rbd2/ns1 1m 2026-03-24T17:23:07.623 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule list -p rbd2 -R 2026-03-24T17:23:07.623 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'every 1m' 2026-03-24T17:23:07.647 INFO:tasks.workunit.client.0.vm01.stdout:rbd2 ns1 every 1m 2026-03-24T17:23:07.648 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule list -p rbd2/ns1 -R 2026-03-24T17:23:07.648 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'every 1m' 2026-03-24T17:23:07.673 INFO:tasks.workunit.client.0.vm01.stdout:rbd2 ns1 every 1m 2026-03-24T17:23:07.673 INFO:tasks.workunit.client.0.vm01.stderr:++ seq 12 2026-03-24T17:23:07.674 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12` 2026-03-24T17:23:07.674 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls rbd2/ns1 2026-03-24T17:23:07.674 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:23:07.674 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^1$' 2026-03-24T17:23:07.697 INFO:tasks.workunit.client.0.vm01.stdout:1 2026-03-24T17:23:07.698 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10 2026-03-24T17:23:17.699 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12` 2026-03-24T17:23:17.699 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls rbd2/ns1 2026-03-24T17:23:17.699 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:23:17.699 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^1$' 2026-03-24T17:23:17.722 INFO:tasks.workunit.client.0.vm01.stdout:1 
2026-03-24T17:23:17.722 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10 2026-03-24T17:23:27.724 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12` 2026-03-24T17:23:27.724 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls rbd2/ns1 2026-03-24T17:23:27.724 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:23:27.724 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^1$' 2026-03-24T17:23:27.942 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:23:27.939+0000 7f4ed24b3640 0 --2- 192.168.123.101:0/799510619 >> [v2:192.168.123.101:3300/0,v1:192.168.123.101:6789/0] conn(0x55ee669c6860 0x55ee66a87300 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 2026-03-24T17:23:27.950 INFO:tasks.workunit.client.0.vm01.stdout:1 2026-03-24T17:23:27.950 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10 2026-03-24T17:23:37.952 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12` 2026-03-24T17:23:37.952 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls rbd2/ns1 2026-03-24T17:23:37.952 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:23:37.952 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^1$' 2026-03-24T17:23:37.977 INFO:tasks.workunit.client.0.vm01.stdout:1 2026-03-24T17:23:37.977 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10 2026-03-24T17:23:47.979 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12` 2026-03-24T17:23:47.979 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls rbd2/ns1 2026-03-24T17:23:47.979 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:23:47.979 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^1$' 2026-03-24T17:23:48.002 INFO:tasks.workunit.client.0.vm01.stdout:1 2026-03-24T17:23:48.002 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10 2026-03-24T17:23:58.003 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12` 2026-03-24T17:23:58.004 
INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls rbd2/ns1 2026-03-24T17:23:58.004 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:23:58.004 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^1$' 2026-03-24T17:23:58.028 INFO:tasks.workunit.client.0.vm01.stdout:1 2026-03-24T17:23:58.028 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10 2026-03-24T17:24:08.030 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12` 2026-03-24T17:24:08.030 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls rbd2/ns1 2026-03-24T17:24:08.030 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:24:08.030 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^1$' 2026-03-24T17:24:08.054 INFO:tasks.workunit.client.0.vm01.stderr:+ break 2026-03-24T17:24:08.054 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash ls rbd2/ns1 2026-03-24T17:24:08.054 INFO:tasks.workunit.client.0.vm01.stderr:+ wc -l 2026-03-24T17:24:08.054 INFO:tasks.workunit.client.0.vm01.stderr:+ grep '^0$' 2026-03-24T17:24:08.077 INFO:tasks.workunit.client.0.vm01.stdout:0 2026-03-24T17:24:08.077 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule list -p rbd2 -R 2026-03-24T17:24:08.077 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'every 1m' 2026-03-24T17:24:08.102 INFO:tasks.workunit.client.0.vm01.stdout:rbd2 ns1 every 1m 2026-03-24T17:24:08.102 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule list -p rbd2/ns1 -R 2026-03-24T17:24:08.102 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'every 1m' 2026-03-24T17:24:08.127 INFO:tasks.workunit.client.0.vm01.stdout:rbd2 ns1 every 1m 2026-03-24T17:24:08.128 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule status 2026-03-24T17:24:08.128 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'rbd2 *ns1' 2026-03-24T17:24:08.152 INFO:tasks.workunit.client.0.vm01.stdout:rbd2 ns1 2026-03-24 17:25:00 2026-03-24T17:24:08.152 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule status -p rbd2 
2026-03-24T17:24:08.152 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'rbd2 *ns1' 2026-03-24T17:24:08.176 INFO:tasks.workunit.client.0.vm01.stdout:rbd2 ns1 2026-03-24 17:25:00 2026-03-24T17:24:08.176 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule status -p rbd2/ns1 2026-03-24T17:24:08.177 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'rbd2 *ns1' 2026-03-24T17:24:08.200 INFO:tasks.workunit.client.0.vm01.stdout:rbd2 ns1 2026-03-24 17:25:00 2026-03-24T17:24:08.200 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule rm -p rbd2/ns1 1m 2026-03-24T17:24:08.227 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule add 2m 2026-03-24T17:24:08.257 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd trash purge schedule add -p rbd dummy 2026-03-24T17:24:08.258 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule add -p rbd dummy 2026-03-24T17:24:08.279 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:24:08.275+0000 7f1bd5681640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy) 2026-03-24T17:24:08.279 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd trash purge schedule add failed: (22) Invalid argument: Invalid interval (dummy) 2026-03-24T17:24:08.282 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-24T17:24:08.282 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd trash purge schedule add -p rbd 1d dummy 2026-03-24T17:24:08.282 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule add -p rbd 1d dummy 2026-03-24T17:24:08.304 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:24:08.299+0000 7f1bd5681640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy 2026-03-24T17:24:08.304 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd trash purge schedule add failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy 2026-03-24T17:24:08.307 
INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-24T17:24:08.307 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd trash purge schedule add dummy 2026-03-24T17:24:08.307 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule add dummy 2026-03-24T17:24:08.328 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:24:08.323+0000 7f1bd5681640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy) 2026-03-24T17:24:08.329 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd trash purge schedule add failed: (22) Invalid argument: Invalid interval (dummy) 2026-03-24T17:24:08.331 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-24T17:24:08.331 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd trash purge schedule add 1d dummy 2026-03-24T17:24:08.331 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule add 1d dummy 2026-03-24T17:24:08.352 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:24:08.347+0000 7f1bd5681640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy 2026-03-24T17:24:08.352 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd trash purge schedule add failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy 2026-03-24T17:24:08.355 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-24T17:24:08.355 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd trash purge schedule remove -p rbd dummy 2026-03-24T17:24:08.355 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule remove -p rbd dummy 2026-03-24T17:24:08.376 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:24:08.371+0000 7f1bd5681640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy) 2026-03-24T17:24:08.377 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd trash purge schedule remove failed: (22) Invalid argument: Invalid interval (dummy) 2026-03-24T17:24:08.379 INFO:tasks.workunit.client.0.vm01.stderr:+ return 
0 2026-03-24T17:24:08.379 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd trash purge schedule remove -p rbd 1d dummy 2026-03-24T17:24:08.379 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule remove -p rbd 1d dummy 2026-03-24T17:24:08.402 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:24:08.399+0000 7f1bd5681640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy 2026-03-24T17:24:08.402 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd trash purge schedule remove failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy 2026-03-24T17:24:08.405 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-24T17:24:08.405 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd trash purge schedule remove dummy 2026-03-24T17:24:08.405 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule remove dummy 2026-03-24T17:24:08.426 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:24:08.423+0000 7f1bd5681640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy) 2026-03-24T17:24:08.426 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd trash purge schedule remove failed: (22) Invalid argument: Invalid interval (dummy) 2026-03-24T17:24:08.429 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-24T17:24:08.429 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd trash purge schedule remove 1d dummy 2026-03-24T17:24:08.429 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule remove 1d dummy 2026-03-24T17:24:08.450 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:24:08.447+0000 7f1bd5681640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy 2026-03-24T17:24:08.450 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd trash purge schedule remove failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy 2026-03-24T17:24:08.453 
INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-24T17:24:08.453 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule ls -p rbd 2026-03-24T17:24:08.453 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'every 1d starting at 01:30' 2026-03-24T17:24:08.477 INFO:tasks.workunit.client.0.vm01.stdout:every 1d starting at 01:30:00 2026-03-24T17:24:08.478 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule ls 2026-03-24T17:24:08.478 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'every 2m' 2026-03-24T17:24:08.501 INFO:tasks.workunit.client.0.vm01.stdout:every 2m 2026-03-24T17:24:08.501 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule remove -p rbd 1d 01:30 2026-03-24T17:24:08.531 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule remove 2m 2026-03-24T17:24:08.560 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd trash purge schedule ls -R --format json 2026-03-24T17:24:08.585 INFO:tasks.workunit.client.0.vm01.stderr:+ test '[]' = '[]' 2026-03-24T17:24:08.585 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images 2026-03-24T17:24:08.585 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:08.663 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:08.737 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:08.811 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:08.880 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:08.952 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:09.026 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:09.100 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:09.174 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:09.247 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:09.525 
INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:09.600 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:09.677 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:09.751 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:09.825 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:09.899 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:09.972 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:10.046 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:10.120 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd pool rm rbd2 rbd2 --yes-i-really-really-mean-it 2026-03-24T17:24:10.614 INFO:tasks.workunit.client.0.vm01.stderr:pool 'rbd2' does not exist 2026-03-24T17:24:10.628 INFO:tasks.workunit.client.0.vm01.stdout:testing recovery of trash_purge_schedule handler after module's RADOS client is blocklisted... 2026-03-24T17:24:10.628 INFO:tasks.workunit.client.0.vm01.stderr:+ test_trash_purge_schedule_recovery 2026-03-24T17:24:10.628 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'testing recovery of trash_purge_schedule handler after module'\''s RADOS client is blocklisted...' 
2026-03-24T17:24:10.628 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images 2026-03-24T17:24:10.628 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:10.707 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:10.781 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:10.869 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:10.946 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:11.022 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:11.096 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:11.170 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:11.246 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:11.321 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:11.397 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:11.474 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:11.751 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:11.828 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:11.903 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:11.976 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:12.047 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:12.123 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:24:12.198 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd pool create rbd3 8 2026-03-24T17:24:12.619 INFO:tasks.workunit.client.0.vm01.stderr:pool 'rbd3' already exists 2026-03-24T17:24:12.631 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd pool init rbd3 2026-03-24T17:24:15.594 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd namespace create rbd3/ns1 2026-03-24T17:24:15.619 
INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule add -p rbd3/ns1 2d 2026-03-24T17:24:15.663 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule ls -p rbd3 -R 2026-03-24T17:24:15.663 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'rbd3 *ns1 *every 2d' 2026-03-24T17:24:15.689 INFO:tasks.workunit.client.0.vm01.stdout:rbd3 ns1 every 2d 2026-03-24T17:24:15.690 INFO:tasks.workunit.client.0.vm01.stderr:++ ceph mgr dump 2026-03-24T17:24:15.690 INFO:tasks.workunit.client.0.vm01.stderr:++ jq 'select(.name == "rbd_support")' 2026-03-24T17:24:15.690 INFO:tasks.workunit.client.0.vm01.stderr:++ jq '.active_clients[]' 2026-03-24T17:24:15.690 INFO:tasks.workunit.client.0.vm01.stderr:++ jq -r '[.addrvec[0].addr, "/", .addrvec[0].nonce|tostring] | add' 2026-03-24T17:24:15.954 INFO:tasks.workunit.client.0.vm01.stderr:+ CLIENT_ADDR=192.168.123.101:0/1914320998 2026-03-24T17:24:15.954 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd blocklist add 192.168.123.101:0/1914320998 2026-03-24T17:24:17.593 INFO:tasks.workunit.client.0.vm01.stderr:blocklisting 192.168.123.101:0/1914320998 until 2026-03-24T18:24:16.639370+0000 (3600 sec) 2026-03-24T17:24:17.608 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd trash purge schedule add -p rbd3 10m 2026-03-24T17:24:17.608 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule add -p rbd3 10m 2026-03-24T17:24:17.631 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:24:17.627+0000 7f1bd5681640 -1 mgr.server reply reply (11) Resource temporarily unavailable [errno 108] RADOS connection was shutdown (Failed to operate write op for oid rbd_trash_purge_schedule) 2026-03-24T17:24:17.631 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd trash purge schedule add failed: (11) Resource temporarily unavailable: [errno 108] RADOS connection was shutdown (Failed to operate write op for oid rbd_trash_purge_schedule) 2026-03-24T17:24:17.634 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 
2026-03-24T17:24:17.634 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:24:27.635 INFO:tasks.workunit.client.0.vm01.stderr:++ seq 24
2026-03-24T17:24:27.636 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 24`
2026-03-24T17:24:27.636 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule add -p rbd3 10m
2026-03-24T17:24:27.658 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:24:27.655+0000 7f1bd5681640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again
2026-03-24T17:24:27.658 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd trash purge schedule add failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again
2026-03-24T17:24:27.660 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:24:37.662 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 24`
2026-03-24T17:24:37.662 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule add -p rbd3 10m
2026-03-24T17:24:37.685 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:24:37.679+0000 7f1bd5681640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again
2026-03-24T17:24:37.685 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd trash purge schedule add failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again
2026-03-24T17:24:37.689 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:24:47.690 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 24`
2026-03-24T17:24:47.690 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule add -p rbd3 10m
2026-03-24T17:24:47.712 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:24:47.707+0000 7f1bd5681640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again
2026-03-24T17:24:47.712 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd trash purge schedule add failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again
2026-03-24T17:24:47.715 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:24:57.716 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 24`
2026-03-24T17:24:57.716 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule add -p rbd3 10m
2026-03-24T17:24:57.738 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:24:57.735+0000 7f1bd5681640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again
2026-03-24T17:24:57.738 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd trash purge schedule add failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again
2026-03-24T17:24:57.741 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:25:07.743 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 24`
2026-03-24T17:25:07.743 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule add -p rbd3 10m
2026-03-24T17:25:07.770 INFO:tasks.workunit.client.0.vm01.stderr:+ break
2026-03-24T17:25:07.770 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule ls -p rbd3 -R
2026-03-24T17:25:07.770 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'every 10m'
2026-03-24T17:25:07.796 INFO:tasks.workunit.client.0.vm01.stdout:rbd3 - every 10m
2026-03-24T17:25:07.796 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule ls -p rbd3 -R
2026-03-24T17:25:07.796 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'rbd3 *ns1 *every 2d'
2026-03-24T17:25:07.820 INFO:tasks.workunit.client.0.vm01.stdout:rbd3 ns1 every 2d
2026-03-24T17:25:07.820 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule remove -p rbd3 10m
2026-03-24T17:25:07.846 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule remove -p rbd3/ns1 2d
2026-03-24T17:25:07.873 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule ls -p rbd3 -R
2026-03-24T17:25:07.873 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail grep 'every 10m'
2026-03-24T17:25:07.873 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'every 10m'
2026-03-24T17:25:07.900 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:25:07.900 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd trash purge schedule ls -p rbd3 -R
2026-03-24T17:25:07.900 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail grep 'rbd3 *ns1 *every 2d'
2026-03-24T17:25:07.900 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'rbd3 *ns1 *every 2d'
2026-03-24T17:25:07.925 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:25:07.925 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd pool rm rbd3 rbd3 --yes-i-really-really-mean-it
2026-03-24T17:25:08.293 INFO:tasks.workunit.client.0.vm01.stderr:pool 'rbd3' does not exist
2026-03-24T17:25:08.305 INFO:tasks.workunit.client.0.vm01.stdout:testing mirror snapshot schedule...
2026-03-24T17:25:08.305 INFO:tasks.workunit.client.0.vm01.stderr:+ test_mirror_snapshot_schedule
2026-03-24T17:25:08.305 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'testing mirror snapshot schedule...'
2026-03-24T17:25:08.305 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images
2026-03-24T17:25:08.305 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:25:08.583 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:25:08.660 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:25:08.735 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:25:08.811 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:25:09.089 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:25:09.166 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:25:09.248 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:25:09.325 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:25:09.399 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:25:09.475 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:25:09.549 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:25:09.626 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:25:09.702 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:25:09.776 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:25:09.850 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:25:09.925 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:25:10.002 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:25:10.080 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd pool create rbd2 8
2026-03-24T17:25:10.303 INFO:tasks.workunit.client.0.vm01.stderr:pool 'rbd2' already exists
2026-03-24T17:25:10.318 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd pool init rbd2
2026-03-24T17:25:13.268 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd namespace create rbd2/ns1
2026-03-24T17:25:13.297 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror pool enable rbd2 image
2026-03-24T17:25:13.324 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror pool enable rbd2/ns1 image
2026-03-24T17:25:13.351 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror pool peer add rbd2 cluster1
2026-03-24T17:25:13.372 INFO:tasks.workunit.client.0.vm01.stdout:419cd337-10ee-4e84-af67-0a87c80737f4
2026-03-24T17:25:13.375 INFO:tasks.workunit.client.0.vm01.stderr:++ ceph rbd mirror snapshot schedule list
2026-03-24T17:25:13.620 INFO:tasks.workunit.client.0.vm01.stderr:+ test '{}' = '{}'
2026-03-24T17:25:13.620 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph rbd mirror snapshot schedule status
2026-03-24T17:25:13.620 INFO:tasks.workunit.client.0.vm01.stderr:+ fgrep '"scheduled_images": []'
2026-03-24T17:25:13.866 INFO:tasks.workunit.client.0.vm01.stdout: "scheduled_images": []
2026-03-24T17:25:13.867 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd mirror snapshot schedule ls
2026-03-24T17:25:13.867 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule ls
2026-03-24T17:25:13.890 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:25:13.890 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd mirror snapshot schedule ls -R --format json
2026-03-24T17:25:13.914 INFO:tasks.workunit.client.0.vm01.stderr:+ test '[]' = '[]'
2026-03-24T17:25:13.914 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 -s 1 rbd2/ns1/test1
2026-03-24T17:25:13.943 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd mirror image status rbd2/ns1/test1
2026-03-24T17:25:13.943 INFO:tasks.workunit.client.0.vm01.stderr:++ grep -c mirror.primary
2026-03-24T17:25:13.967 INFO:tasks.workunit.client.0.vm01.stderr:rbd: mirroring not enabled on the image
2026-03-24T17:25:13.971 INFO:tasks.workunit.client.0.vm01.stderr:+ test 0 = 0
2026-03-24T17:25:13.971 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror image enable rbd2/ns1/test1 snapshot
2026-03-24T17:25:14.276 INFO:tasks.workunit.client.0.vm01.stdout:Mirroring enabled
2026-03-24T17:25:14.283 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd mirror image status rbd2/ns1/test1
2026-03-24T17:25:14.283 INFO:tasks.workunit.client.0.vm01.stderr:++ grep -c mirror.primary
2026-03-24T17:25:14.315 INFO:tasks.workunit.client.0.vm01.stderr:+ test 1 = 1
2026-03-24T17:25:14.315 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd mirror snapshot schedule remove dummy
2026-03-24T17:25:14.315 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule remove dummy
2026-03-24T17:25:14.335 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:25:14.331+0000 7f1bd5681640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy)
2026-03-24T17:25:14.335 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd mirror snapshot schedule remove failed: (22) Invalid argument: Invalid interval (dummy)
2026-03-24T17:25:14.338 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:25:14.338 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd mirror snapshot schedule remove 1h dummy
2026-03-24T17:25:14.338 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule remove 1h dummy
2026-03-24T17:25:14.358 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:25:14.355+0000 7f1bd5681640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy
2026-03-24T17:25:14.358 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd mirror snapshot schedule remove failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy
2026-03-24T17:25:14.360 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:25:14.360 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd mirror snapshot schedule remove -p rbd2/ns1 --image test1 dummy
2026-03-24T17:25:14.360 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule remove -p rbd2/ns1 --image test1 dummy
2026-03-24T17:25:14.390 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:25:14.387+0000 7f1bd5681640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy)
2026-03-24T17:25:14.390 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd mirror snapshot schedule remove failed: (22) Invalid argument: Invalid interval (dummy)
2026-03-24T17:25:14.395 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:25:14.395 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd mirror snapshot schedule remove -p rbd2/ns1 --image test1 1h dummy
2026-03-24T17:25:14.395 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule remove -p rbd2/ns1 --image test1 1h dummy
2026-03-24T17:25:14.425 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:25:14.423+0000 7f1bd5681640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy
2026-03-24T17:25:14.425 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd mirror snapshot schedule remove failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy
2026-03-24T17:25:14.429 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:25:14.429 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule add -p rbd2/ns1 --image test1 1m
2026-03-24T17:25:14.463 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd mirror snapshot schedule ls
2026-03-24T17:25:14.463 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule ls
2026-03-24T17:25:14.486 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:25:14.486 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule ls -R
2026-03-24T17:25:14.486 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m'
2026-03-24T17:25:14.509 INFO:tasks.workunit.client.0.vm01.stdout:rbd2 ns1 test1 every 1m
2026-03-24T17:25:14.510 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd mirror snapshot schedule ls -p rbd2
2026-03-24T17:25:14.510 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule ls -p rbd2
2026-03-24T17:25:14.535 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:25:14.535 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule ls -p rbd2 -R
2026-03-24T17:25:14.535 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m'
2026-03-24T17:25:14.559 INFO:tasks.workunit.client.0.vm01.stdout:rbd2 ns1 test1 every 1m
2026-03-24T17:25:14.559 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd mirror snapshot schedule ls -p rbd2/ns1
2026-03-24T17:25:14.559 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule ls -p rbd2/ns1
2026-03-24T17:25:14.585 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:25:14.585 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule ls -p rbd2/ns1 -R
2026-03-24T17:25:14.585 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m'
2026-03-24T17:25:14.610 INFO:tasks.workunit.client.0.vm01.stdout:rbd2 ns1 test1 every 1m
2026-03-24T17:25:14.610 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd mirror snapshot schedule ls -p rbd2/ns1 --image test1
2026-03-24T17:25:14.639 INFO:tasks.workunit.client.0.vm01.stderr:+ test 'every 1m' = 'every 1m'
2026-03-24T17:25:14.639 INFO:tasks.workunit.client.0.vm01.stderr:++ seq 12
2026-03-24T17:25:14.640 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12`
2026-03-24T17:25:14.640 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd mirror image status rbd2/ns1/test1
2026-03-24T17:25:14.640 INFO:tasks.workunit.client.0.vm01.stderr:++ grep -c mirror.primary
2026-03-24T17:25:14.731 INFO:tasks.workunit.client.0.vm01.stderr:+ test 1 -gt 1
2026-03-24T17:25:14.731 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:25:24.732 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12`
2026-03-24T17:25:24.732 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd mirror image status rbd2/ns1/test1
2026-03-24T17:25:24.732 INFO:tasks.workunit.client.0.vm01.stderr:++ grep -c mirror.primary
2026-03-24T17:25:24.768 INFO:tasks.workunit.client.0.vm01.stderr:+ test 1 -gt 1
2026-03-24T17:25:24.768 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:25:34.769 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12`
2026-03-24T17:25:34.769 INFO:tasks.workunit.client.0.vm01.stderr:++ grep -c mirror.primary
2026-03-24T17:25:34.769 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd mirror image status rbd2/ns1/test1
2026-03-24T17:25:34.804 INFO:tasks.workunit.client.0.vm01.stderr:+ test 1 -gt 1
2026-03-24T17:25:34.804 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:25:44.805 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12`
2026-03-24T17:25:44.805 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd mirror image status rbd2/ns1/test1
2026-03-24T17:25:44.806 INFO:tasks.workunit.client.0.vm01.stderr:++ grep -c mirror.primary
2026-03-24T17:25:44.838 INFO:tasks.workunit.client.0.vm01.stderr:+ test 1 -gt 1
2026-03-24T17:25:44.838 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:25:54.839 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12`
2026-03-24T17:25:54.840 INFO:tasks.workunit.client.0.vm01.stderr:++ grep -c mirror.primary
2026-03-24T17:25:54.840 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd mirror image status rbd2/ns1/test1
2026-03-24T17:25:54.873 INFO:tasks.workunit.client.0.vm01.stderr:+ test 1 -gt 1
2026-03-24T17:25:54.873 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:26:04.874 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12`
2026-03-24T17:26:04.874 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd mirror image status rbd2/ns1/test1
2026-03-24T17:26:04.874 INFO:tasks.workunit.client.0.vm01.stderr:++ grep -c mirror.primary
2026-03-24T17:26:04.909 INFO:tasks.workunit.client.0.vm01.stderr:+ test 2 -gt 1
2026-03-24T17:26:04.909 INFO:tasks.workunit.client.0.vm01.stderr:+ break
2026-03-24T17:26:04.909 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd mirror image status rbd2/ns1/test1
2026-03-24T17:26:04.909 INFO:tasks.workunit.client.0.vm01.stderr:++ grep -c mirror.primary
2026-03-24T17:26:04.939 INFO:tasks.workunit.client.0.vm01.stderr:+ test 2 -gt 1
2026-03-24T17:26:04.939 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd mirror snapshot schedule ls
2026-03-24T17:26:04.939 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule ls
2026-03-24T17:26:04.963 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:26:04.963 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule ls -R
2026-03-24T17:26:04.964 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m'
2026-03-24T17:26:04.986 INFO:tasks.workunit.client.0.vm01.stdout:rbd2 ns1 test1 every 1m
2026-03-24T17:26:04.987 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd mirror snapshot schedule ls -p rbd2
2026-03-24T17:26:04.987 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule ls -p rbd2
2026-03-24T17:26:05.011 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:26:05.011 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule ls -p rbd2 -R
2026-03-24T17:26:05.011 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m'
2026-03-24T17:26:05.035 INFO:tasks.workunit.client.0.vm01.stdout:rbd2 ns1 test1 every 1m
2026-03-24T17:26:05.035 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd mirror snapshot schedule ls -p rbd2/ns1
2026-03-24T17:26:05.035 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule ls -p rbd2/ns1
2026-03-24T17:26:05.061 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:26:05.061 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule ls -p rbd2/ns1 -R
2026-03-24T17:26:05.061 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m'
2026-03-24T17:26:05.086 INFO:tasks.workunit.client.0.vm01.stdout:rbd2 ns1 test1 every 1m
2026-03-24T17:26:05.086 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd mirror snapshot schedule ls -p rbd2/ns1 --image test1
2026-03-24T17:26:05.115 INFO:tasks.workunit.client.0.vm01.stderr:+ test 'every 1m' = 'every 1m'
2026-03-24T17:26:05.115 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule status
2026-03-24T17:26:05.134 INFO:tasks.workunit.client.0.vm01.stdout:SCHEDULE TIME IMAGE
2026-03-24T17:26:05.134 INFO:tasks.workunit.client.0.vm01.stdout:2026-03-24 17:27:00 rbd2/ns1/test1
2026-03-24T17:26:05.137 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd mirror snapshot schedule status --format xml
2026-03-24T17:26:05.137 INFO:tasks.workunit.client.0.vm01.stderr:++ xmlstarlet sel -t -v //scheduled_images/image/image
2026-03-24T17:26:05.160 INFO:tasks.workunit.client.0.vm01.stderr:+ test rbd2/ns1/test1 = rbd2/ns1/test1
2026-03-24T17:26:05.161 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd mirror snapshot schedule status -p rbd2 --format xml
2026-03-24T17:26:05.161 INFO:tasks.workunit.client.0.vm01.stderr:++ xmlstarlet sel -t -v //scheduled_images/image/image
2026-03-24T17:26:05.184 INFO:tasks.workunit.client.0.vm01.stderr:+ test rbd2/ns1/test1 = rbd2/ns1/test1
2026-03-24T17:26:05.184 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd mirror snapshot schedule status -p rbd2/ns1 --format xml
2026-03-24T17:26:05.184 INFO:tasks.workunit.client.0.vm01.stderr:++ xmlstarlet sel -t -v //scheduled_images/image/image
2026-03-24T17:26:05.210 INFO:tasks.workunit.client.0.vm01.stderr:+ test rbd2/ns1/test1 = rbd2/ns1/test1
2026-03-24T17:26:05.210 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd mirror snapshot schedule status -p rbd2/ns1 --image test1 --format xml
2026-03-24T17:26:05.210 INFO:tasks.workunit.client.0.vm01.stderr:++ xmlstarlet sel -t -v //scheduled_images/image/image
2026-03-24T17:26:05.242 INFO:tasks.workunit.client.0.vm01.stderr:+ test rbd2/ns1/test1 = rbd2/ns1/test1
2026-03-24T17:26:05.242 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror image demote rbd2/ns1/test1
2026-03-24T17:26:05.391 INFO:tasks.workunit.client.0.vm01.stdout:Image demoted to non-primary
2026-03-24T17:26:05.395 INFO:tasks.workunit.client.0.vm01.stderr:++ seq 12
2026-03-24T17:26:05.396 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12`
2026-03-24T17:26:05.396 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule status
2026-03-24T17:26:05.396 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2/ns1/test1
2026-03-24T17:26:05.422 INFO:tasks.workunit.client.0.vm01.stdout:2026-03-24 17:27:00 rbd2/ns1/test1
2026-03-24T17:26:05.428 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:26:15.423 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12`
2026-03-24T17:26:15.423 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule status
2026-03-24T17:26:15.423 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2/ns1/test1
2026-03-24T17:26:15.447 INFO:tasks.workunit.client.0.vm01.stdout:2026-03-24 17:27:00 rbd2/ns1/test1
2026-03-24T17:26:15.447 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:26:25.448 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12`
2026-03-24T17:26:25.448 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule status
2026-03-24T17:26:25.448 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2/ns1/test1
2026-03-24T17:26:25.472 INFO:tasks.workunit.client.0.vm01.stdout:2026-03-24 17:27:00 rbd2/ns1/test1
2026-03-24T17:26:25.472 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:26:35.473 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12`
2026-03-24T17:26:35.473 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule status
2026-03-24T17:26:35.473 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2/ns1/test1
2026-03-24T17:26:35.499 INFO:tasks.workunit.client.0.vm01.stdout:2026-03-24 17:27:00 rbd2/ns1/test1
2026-03-24T17:26:35.499 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:26:45.500 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12`
2026-03-24T17:26:45.500 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule status
2026-03-24T17:26:45.500 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2/ns1/test1
2026-03-24T17:26:45.523 INFO:tasks.workunit.client.0.vm01.stdout:2026-03-24 17:27:00 rbd2/ns1/test1
2026-03-24T17:26:45.524 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:26:55.525 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12`
2026-03-24T17:26:55.525 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule status
2026-03-24T17:26:55.525 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2/ns1/test1
2026-03-24T17:26:55.550 INFO:tasks.workunit.client.0.vm01.stdout:2026-03-24 17:27:00 rbd2/ns1/test1
2026-03-24T17:26:55.551 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:27:05.552 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12`
2026-03-24T17:27:05.552 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule status
2026-03-24T17:27:05.552 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2/ns1/test1
2026-03-24T17:27:05.578 INFO:tasks.workunit.client.0.vm01.stderr:+ break
2026-03-24T17:27:05.578 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule status
2026-03-24T17:27:05.578 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail grep rbd2/ns1/test1
2026-03-24T17:27:05.578 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2/ns1/test1
2026-03-24T17:27:05.601 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:27:05.601 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror image promote rbd2/ns1/test1
2026-03-24T17:27:06.265 INFO:tasks.workunit.client.0.vm01.stdout:Image promoted to primary
2026-03-24T17:27:06.271 INFO:tasks.workunit.client.0.vm01.stderr:++ seq 12
2026-03-24T17:27:06.272 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12`
2026-03-24T17:27:06.272 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule status
2026-03-24T17:27:06.272 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2/ns1/test1
2026-03-24T17:27:06.297 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:27:16.298 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12`
2026-03-24T17:27:16.299 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule status
2026-03-24T17:27:16.299 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2/ns1/test1
2026-03-24T17:27:16.322 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:27:26.323 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12`
2026-03-24T17:27:26.323 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule status
2026-03-24T17:27:26.323 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2/ns1/test1
2026-03-24T17:27:26.346 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:27:36.348 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12`
2026-03-24T17:27:36.348 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule status
2026-03-24T17:27:36.348 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2/ns1/test1
2026-03-24T17:27:36.372 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:27:46.373 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12`
2026-03-24T17:27:46.373 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule status
2026-03-24T17:27:46.378 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2/ns1/test1
2026-03-24T17:27:46.396 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:27:56.397 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12`
2026-03-24T17:27:56.398 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule status
2026-03-24T17:27:56.398 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2/ns1/test1
2026-03-24T17:27:56.421 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:28:06.422 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12`
2026-03-24T17:28:06.422 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule status
2026-03-24T17:28:06.422 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2/ns1/test1
2026-03-24T17:28:06.447 INFO:tasks.workunit.client.0.vm01.stdout:2026-03-24 17:29:00 rbd2/ns1/test1
2026-03-24T17:28:06.447 INFO:tasks.workunit.client.0.vm01.stderr:+ break
2026-03-24T17:28:06.447 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule status
2026-03-24T17:28:06.447 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2/ns1/test1
2026-03-24T17:28:06.469 INFO:tasks.workunit.client.0.vm01.stdout:2026-03-24 17:29:00 rbd2/ns1/test1
2026-03-24T17:28:06.469 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule add 1h 00:15
2026-03-24T17:28:06.501 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd mirror snapshot schedule ls
2026-03-24T17:28:06.524 INFO:tasks.workunit.client.0.vm01.stderr:+ test 'every 1h starting at 00:15:00' = 'every 1h starting at 00:15:00'
2026-03-24T17:28:06.524 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule ls -R
2026-03-24T17:28:06.524 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'every 1h starting at 00:15:00'
2026-03-24T17:28:06.548 INFO:tasks.workunit.client.0.vm01.stdout:- - - every 1h starting at 00:15:00
2026-03-24T17:28:06.548 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule ls -R
2026-03-24T17:28:06.548 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m'
2026-03-24T17:28:06.572 INFO:tasks.workunit.client.0.vm01.stdout:rbd2 ns1 test1 every 1m
2026-03-24T17:28:06.572 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd mirror snapshot schedule ls -p rbd2
2026-03-24T17:28:06.572 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule ls -p rbd2
2026-03-24T17:28:06.598 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:28:06.599 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule ls -p rbd2 -R
2026-03-24T17:28:06.599 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'every 1h starting at 00:15:00'
2026-03-24T17:28:06.623 INFO:tasks.workunit.client.0.vm01.stdout:- - - every 1h starting at 00:15:00
2026-03-24T17:28:06.624 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule ls -p rbd2 -R
2026-03-24T17:28:06.624 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m'
2026-03-24T17:28:06.649 INFO:tasks.workunit.client.0.vm01.stdout:rbd2 ns1 test1 every 1m
2026-03-24T17:28:06.649 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd mirror snapshot schedule ls -p rbd2/ns1
2026-03-24T17:28:06.649 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule ls -p rbd2/ns1
2026-03-24T17:28:06.675 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:28:06.675 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule ls -p rbd2/ns1 -R
2026-03-24T17:28:06.675 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'every 1h starting at 00:15:00'
2026-03-24T17:28:06.701 INFO:tasks.workunit.client.0.vm01.stdout:- - - every 1h starting at 00:15:00
2026-03-24T17:28:06.702 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule ls -p rbd2/ns1 -R
2026-03-24T17:28:06.702 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m'
2026-03-24T17:28:06.728 INFO:tasks.workunit.client.0.vm01.stdout:rbd2 ns1 test1 every 1m
2026-03-24T17:28:06.729 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd mirror snapshot schedule ls -p rbd2/ns1 --image test1
2026-03-24T17:28:06.761 INFO:tasks.workunit.client.0.vm01.stderr:+ test 'every 1m' = 'every 1m'
2026-03-24T17:28:06.761 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd mirror snapshot schedule add dummy
2026-03-24T17:28:06.761 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule add dummy
2026-03-24T17:28:06.783 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:28:06.779+0000 7f1bd5681640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy)
2026-03-24T17:28:06.783 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd mirror snapshot schedule add failed: (22) Invalid argument: Invalid interval (dummy)
2026-03-24T17:28:06.786 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:28:06.786 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd mirror snapshot schedule add 1h dummy
2026-03-24T17:28:06.786 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule add 1h dummy
2026-03-24T17:28:06.808 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:28:06.803+0000 7f1bd5681640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy
2026-03-24T17:28:06.809 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd mirror snapshot schedule add failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy
2026-03-24T17:28:06.812 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:28:06.812 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd mirror snapshot schedule add -p rbd2/ns1 --image test1 dummy
2026-03-24T17:28:06.812 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule add -p rbd2/ns1 --image test1 dummy
2026-03-24T17:28:06.841 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:28:06.835+0000 7f1bd5681640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy)
2026-03-24T17:28:06.841 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd mirror snapshot schedule add failed: (22) Invalid argument: Invalid interval (dummy)
2026-03-24T17:28:06.844 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:28:06.844 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd mirror snapshot schedule add -p rbd2/ns1 --image test1 1h dummy
2026-03-24T17:28:06.844 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule add -p rbd2/ns1 --image test1 1h dummy
2026-03-24T17:28:06.873 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:28:06.867+0000 7f1bd5681640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy
2026-03-24T17:28:06.873 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd mirror snapshot schedule add failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy
2026-03-24T17:28:06.876 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:28:06.876 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd mirror snapshot schedule remove dummy
2026-03-24T17:28:06.876 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule remove dummy
2026-03-24T17:28:06.896 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:28:06.891+0000 7f1bd5681640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy)
2026-03-24T17:28:06.897 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd mirror snapshot schedule remove failed: (22) Invalid argument: Invalid interval (dummy)
2026-03-24T17:28:06.899 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:28:06.899 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd mirror snapshot schedule remove 1h dummy
2026-03-24T17:28:06.899 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule remove 1h dummy
2026-03-24T17:28:06.919 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:28:06.915+0000 7f1bd5681640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy
2026-03-24T17:28:06.919 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd mirror snapshot schedule remove failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy
2026-03-24T17:28:06.922 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:28:06.922 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd mirror snapshot schedule remove -p rbd2/ns1 --image test1 dummy
2026-03-24T17:28:06.922 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule remove -p rbd2/ns1 --image test1 dummy
2026-03-24T17:28:06.952 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:28:06.947+0000 7f1bd5681640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy)
2026-03-24T17:28:06.952 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd mirror snapshot schedule remove failed: (22) Invalid argument: Invalid interval (dummy)
2026-03-24T17:28:06.954 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:28:06.954 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd mirror snapshot schedule remove -p rbd2/ns1 --image test1 1h dummy
2026-03-24T17:28:06.954 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule remove -p rbd2/ns1 --image test1 1h dummy
2026-03-24T17:28:06.982 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:28:06.979+0000 7f1bd5681640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy
2026-03-24T17:28:06.982 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd mirror snapshot schedule remove failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy
2026-03-24T17:28:06.985 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:28:06.985 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd mirror snapshot schedule ls
2026-03-24T17:28:07.006 INFO:tasks.workunit.client.0.vm01.stderr:+ test 'every 1h starting at 00:15:00' = 'every 1h starting at 00:15:00'
2026-03-24T17:28:07.007 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd mirror snapshot schedule ls -p rbd2/ns1 --image test1
2026-03-24T17:28:07.036 INFO:tasks.workunit.client.0.vm01.stderr:+ test 'every 1m' = 'every 1m'
2026-03-24T17:28:07.036 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm rbd2/ns1/test1
2026-03-24T17:28:09.514 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:28:09.511+0000 7f4aac2b7640 0 -- 192.168.123.101:0/2581290434 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x55e3b3f17910 msgr2=0x55e3b3f49b70 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:28:10.521 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:28:10.515+0000 7f4aac2b7640 0 -- 192.168.123.101:0/2581290434 >> [v2:192.168.123.101:6800/4203820349,v1:192.168.123.101:6801/4203820349] conn(0x7f4a8c05d1b0 msgr2=0x7f4a8c07d590 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T17:28:11.539 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:28:11.542 INFO:tasks.workunit.client.0.vm01.stderr:++ seq 12
2026-03-24T17:28:11.543 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12`
2026-03-24T17:28:11.543 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule status
2026-03-24T17:28:11.543 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2/ns1/test1
2026-03-24T17:28:11.566 INFO:tasks.workunit.client.0.vm01.stdout:2026-03-24 17:29:00 rbd2/ns1/test1
2026-03-24T17:28:11.566 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:28:21.567 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12`
2026-03-24T17:28:21.567 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule status
2026-03-24T17:28:21.567 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2/ns1/test1
2026-03-24T17:28:21.605 INFO:tasks.workunit.client.0.vm01.stdout:2026-03-24 17:29:00 rbd2/ns1/test1
2026-03-24T17:28:21.606 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:28:31.607 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12`
2026-03-24T17:28:31.607 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule status
2026-03-24T17:28:31.607 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2/ns1/test1
2026-03-24T17:28:31.639 INFO:tasks.workunit.client.0.vm01.stdout:2026-03-24 17:29:00 rbd2/ns1/test1
2026-03-24T17:28:31.639 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:28:41.640 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12`
2026-03-24T17:28:41.641 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule status
2026-03-24T17:28:41.641 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2/ns1/test1
2026-03-24T17:28:41.665 INFO:tasks.workunit.client.0.vm01.stdout:2026-03-24 17:29:00 rbd2/ns1/test1
2026-03-24T17:28:41.666 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:28:51.667 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12`
2026-03-24T17:28:51.667 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule status
2026-03-24T17:28:51.667 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2/ns1/test1
2026-03-24T17:28:51.690 INFO:tasks.workunit.client.0.vm01.stdout:2026-03-24 17:29:00 rbd2/ns1/test1
2026-03-24T17:28:51.690 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:29:01.691 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 12`
2026-03-24T17:29:01.691 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule status
2026-03-24T17:29:01.691 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2/ns1/test1
2026-03-24T17:29:01.714 INFO:tasks.workunit.client.0.vm01.stderr:+ break
2026-03-24T17:29:01.714 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule status
2026-03-24T17:29:01.714 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail grep rbd2/ns1/test1
2026-03-24T17:29:01.714 INFO:tasks.workunit.client.0.vm01.stderr:+ grep rbd2/ns1/test1
2026-03-24T17:29:01.735 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:29:01.735 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule remove
2026-03-24T17:29:01.765 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd mirror snapshot schedule ls -R --format json
2026-03-24T17:29:01.788 INFO:tasks.workunit.client.0.vm01.stderr:+ test '[]' = '[]'
2026-03-24T17:29:01.789 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images
2026-03-24T17:29:01.789 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:01.861 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:01.934 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:02.040 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:02.113 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:02.186 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:02.260 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:02.335 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:02.412 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:02.484 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:02.559 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:02.634 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:02.708 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:02.785 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:02.860 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:02.931 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:03.005 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:03.080 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:03.155 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd pool rm rbd2 rbd2 --yes-i-really-really-mean-it
2026-03-24T17:29:04.342 INFO:tasks.workunit.client.0.vm01.stderr:pool 'rbd2' does not exist
2026-03-24T17:29:04.354 INFO:tasks.workunit.client.0.vm01.stdout:testing recovery of mirror snapshot scheduler after module's RADOS client is blocklisted...
2026-03-24T17:29:04.354 INFO:tasks.workunit.client.0.vm01.stderr:+ test_mirror_snapshot_schedule_recovery
2026-03-24T17:29:04.354 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'testing recovery of mirror snapshot scheduler after module'\''s RADOS client is blocklisted...'
2026-03-24T17:29:04.354 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images
2026-03-24T17:29:04.354 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:04.430 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:04.505 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:04.578 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:04.650 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:04.725 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:04.799 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:04.871 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:04.949 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:05.023 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:05.099 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:05.375 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:05.451 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:05.524 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:05.599 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:05.671 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:05.746 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:05.822 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:29:05.897 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd pool create rbd3 8
2026-03-24T17:29:06.346 INFO:tasks.workunit.client.0.vm01.stderr:pool 'rbd3' already exists
2026-03-24T17:29:06.357 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd pool init rbd3
2026-03-24T17:29:09.309 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd namespace create rbd3/ns1
2026-03-24T17:29:09.333 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror pool enable rbd3 image
2026-03-24T17:29:09.357 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror pool enable rbd3/ns1 image
2026-03-24T17:29:09.381 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror pool peer add rbd3 cluster1
2026-03-24T17:29:09.402 INFO:tasks.workunit.client.0.vm01.stdout:55861696-0f5f-4cfc-a569-a20f5f13f2d8
2026-03-24T17:29:09.404 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 -s 1 rbd3/ns1/test1
2026-03-24T17:29:09.432 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror image enable rbd3/ns1/test1 snapshot
2026-03-24T17:29:10.316 INFO:tasks.workunit.client.0.vm01.stdout:Mirroring enabled
2026-03-24T17:29:10.322 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd mirror image status rbd3/ns1/test1
2026-03-24T17:29:10.322 INFO:tasks.workunit.client.0.vm01.stderr:++ grep -c mirror.primary
2026-03-24T17:29:10.352 INFO:tasks.workunit.client.0.vm01.stderr:+ test 1 = 1
2026-03-24T17:29:10.352 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 1m
2026-03-24T17:29:10.383 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd mirror snapshot schedule ls -p rbd3/ns1 --image test1
2026-03-24T17:29:10.412 INFO:tasks.workunit.client.0.vm01.stderr:+ test 'every 1m' = 'every 1m'
2026-03-24T17:29:10.412 INFO:tasks.workunit.client.0.vm01.stderr:++ ceph mgr dump
2026-03-24T17:29:10.412 INFO:tasks.workunit.client.0.vm01.stderr:++ jq 'select(.name == "rbd_support")'
2026-03-24T17:29:10.412 INFO:tasks.workunit.client.0.vm01.stderr:++ jq '.active_clients[]'
2026-03-24T17:29:10.412 INFO:tasks.workunit.client.0.vm01.stderr:++ jq -r '[.addrvec[0].addr, "/", .addrvec[0].nonce|tostring] | add'
2026-03-24T17:29:10.670 INFO:tasks.workunit.client.0.vm01.stderr:+ CLIENT_ADDR=192.168.123.101:0/2911768874
2026-03-24T17:29:10.670 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd blocklist add 192.168.123.101:0/2911768874
2026-03-24T17:29:12.312 INFO:tasks.workunit.client.0.vm01.stderr:blocklisting 192.168.123.101:0/2911768874 until 2026-03-24T18:29:11.361785+0000 (3600 sec)
2026-03-24T17:29:12.326 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 2m
2026-03-24T17:29:12.326 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 2m
2026-03-24T17:29:12.348 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:29:12.343+0000 7f1bd5681640 -1 librbd::api::Namespace: list: error listing namespaces: (108) Cannot send after transport endpoint shutdown
2026-03-24T17:29:12.348 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:29:12.343+0000 7f1bd5681640 -1 mgr.server reply reply (11) Resource temporarily unavailable [errno 108] RBD connection was shutdown (error listing namespaces)
2026-03-24T17:29:12.348 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd mirror snapshot schedule add failed: (11) Resource temporarily unavailable: [errno 108] RBD connection was shutdown (error listing namespaces)
2026-03-24T17:29:12.350 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:29:12.351 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:29:22.352 INFO:tasks.workunit.client.0.vm01.stderr:++ seq 24
2026-03-24T17:29:22.353 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 24`
2026-03-24T17:29:22.353 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 2m
2026-03-24T17:29:22.375 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:29:22.371+0000 7f1bd5681640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again
2026-03-24T17:29:22.375 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd mirror snapshot schedule add failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again
2026-03-24T17:29:22.377 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:29:32.378 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 24`
2026-03-24T17:29:32.378 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 2m
2026-03-24T17:29:32.401 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:29:32.395+0000 7f1bd5681640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again
2026-03-24T17:29:32.401 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd mirror snapshot schedule add failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again
2026-03-24T17:29:32.403 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:29:42.405 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 24`
2026-03-24T17:29:42.405 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 2m
2026-03-24T17:29:42.430 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:29:42.427+0000 7f1bd5681640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again
2026-03-24T17:29:42.430 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd mirror snapshot schedule add failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again
2026-03-24T17:29:42.433 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:29:52.434 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 24`
2026-03-24T17:29:52.434 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 2m
2026-03-24T17:29:52.457 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:29:52.451+0000 7f1bd5681640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again
2026-03-24T17:29:52.457 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd mirror snapshot schedule add failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again
2026-03-24T17:29:52.459 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:30:02.461 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 24`
2026-03-24T17:30:02.461 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 2m
2026-03-24T17:30:02.483 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:30:02.479+0000 7f1bd5681640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again
2026-03-24T17:30:02.483 INFO:tasks.workunit.client.0.vm01.stderr:rbd: rbd mirror snapshot schedule add failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again
2026-03-24T17:30:02.485 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:30:12.486 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 24`
2026-03-24T17:30:12.486 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 2m
2026-03-24T17:30:12.519 INFO:tasks.workunit.client.0.vm01.stderr:+ break
2026-03-24T17:30:12.519 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule ls -p rbd3/ns1 --image test1
2026-03-24T17:30:12.519 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'every 2m'
2026-03-24T17:30:12.549 INFO:tasks.workunit.client.0.vm01.stdout:every 2m, every 1m
2026-03-24T17:30:12.549 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule ls -p rbd3/ns1 --image test1
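The recovery test above keeps reissuing `rbd mirror snapshot schedule add` until the mgr stops answering "rbd_support module is not ready, try again", bounded by `seq 24` attempts with `sleep 10` between them. The generic shape of that loop can be sketched as follows (attempt count, delay, and the wrapped command are illustrative, not the trace's exact values):

```shell
#!/usr/bin/env bash
# Retry a command until it succeeds or the attempt budget is exhausted,
# mirroring the `for i in \`seq 24\` ... sleep 10` loop in the trace.
retry() {
    local attempts=$1 delay=$2
    shift 2
    local i
    for i in $(seq "$attempts"); do
        "$@" && return 0        # success: stop retrying
        sleep "$delay"          # failure: back off, then try again
    done
    return 1                    # budget exhausted
}

# Example with a command that succeeds immediately (zero delay so the
# sketch runs quickly):
retry 3 0 true && echo "command eventually succeeded"
```

The overall return status lets the caller distinguish "recovered within the budget" from "still down after all attempts", which is exactly what the workunit asserts.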
2026-03-24T17:30:12.549 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'every 1m'
2026-03-24T17:30:12.579 INFO:tasks.workunit.client.0.vm01.stdout:every 2m, every 1m
2026-03-24T17:30:12.580 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule rm -p rbd3/ns1 --image test1 2m
2026-03-24T17:30:12.611 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule rm -p rbd3/ns1 --image test1 1m
2026-03-24T17:30:12.643 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule ls -p rbd3/ns1 --image test1
2026-03-24T17:30:12.643 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail grep 'every 2m'
2026-03-24T17:30:12.643 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'every 2m'
2026-03-24T17:30:12.674 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:30:12.674 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror snapshot schedule ls -p rbd3/ns1 --image test1
2026-03-24T17:30:12.674 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail grep 'every 1m'
2026-03-24T17:30:12.674 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'every 1m'
2026-03-24T17:30:12.705 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:30:12.705 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap purge rbd3/ns1/test1
2026-03-24T17:30:12.733 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd rm rbd3/ns1/test1
2026-03-24T17:30:13.349 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:30:13.343+0000 7f37c7c55640 0 -- 192.168.123.101:0/1416286867 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x7f37a805c6a0 msgr2=0x7f37a807cac0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T17:30:13.354 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:30:13.351+0000 7f37c61cb640 0 -- 192.168.123.101:0/1416286867 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x7f37a4007640 msgr2=0x7f37a4007a60 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T17:30:13.360 INFO:tasks.workunit.client.0.vm01.stderr: Removing image: 100% complete...done.
2026-03-24T17:30:13.364 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd pool rm rbd3 rbd3 --yes-i-really-really-mean-it
2026-03-24T17:30:14.392 INFO:tasks.workunit.client.0.vm01.stderr:pool 'rbd3' does not exist
2026-03-24T17:30:14.404 INFO:tasks.workunit.client.0.vm01.stdout:testing perf image iostat...
2026-03-24T17:30:14.404 INFO:tasks.workunit.client.0.vm01.stderr:+ test_perf_image_iostat
2026-03-24T17:30:14.404 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'testing perf image iostat...'
2026-03-24T17:30:14.404 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images
2026-03-24T17:30:14.404 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:30:14.482 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:30:14.760 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:30:14.837 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:30:14.915 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:30:14.995 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:30:15.073 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:30:15.149 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:30:15.229 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:30:15.507 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:30:15.786 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:30:15.864 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:30:15.942 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:30:16.019 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:30:16.095 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:30:16.171 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:30:16.247 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:30:16.324 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:30:16.403 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd pool create rbd1 8
2026-03-24T17:30:17.404 INFO:tasks.workunit.client.0.vm01.stderr:pool 'rbd1' already exists
2026-03-24T17:30:17.417 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd pool init rbd1
2026-03-24T17:30:20.368 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd namespace create rbd1/ns
2026-03-24T17:30:20.394 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd pool create rbd2 8
2026-03-24T17:30:21.425 INFO:tasks.workunit.client.0.vm01.stderr:pool 'rbd2' already exists
2026-03-24T17:30:21.439 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd pool init rbd2
2026-03-24T17:30:24.385 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd namespace create rbd2/ns
2026-03-24T17:30:24.413 INFO:tasks.workunit.client.0.vm01.stderr:+ IMAGE_SPECS=("test1" "rbd1/test2" "rbd1/ns/test3" "rbd2/test4" "rbd2/ns/test5")
2026-03-24T17:30:24.413 INFO:tasks.workunit.client.0.vm01.stderr:+ for spec in "${IMAGE_SPECS[@]}"
2026-03-24T17:30:24.413 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 10G --rbd-default-data-pool '' test1
2026-03-24T17:30:24.449 INFO:tasks.workunit.client.0.vm01.stderr:+ for spec in "${IMAGE_SPECS[@]}"
2026-03-24T17:30:24.449 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 10G --rbd-default-data-pool '' rbd1/test2
2026-03-24T17:30:24.480 INFO:tasks.workunit.client.0.vm01.stderr:+ for spec in "${IMAGE_SPECS[@]}"
2026-03-24T17:30:24.480 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 10G --rbd-default-data-pool '' rbd1/ns/test3
2026-03-24T17:30:24.509 INFO:tasks.workunit.client.0.vm01.stderr:+ for spec in "${IMAGE_SPECS[@]}"
2026-03-24T17:30:24.509 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 10G --rbd-default-data-pool '' rbd2/test4
2026-03-24T17:30:24.539 INFO:tasks.workunit.client.0.vm01.stderr:+ for spec in "${IMAGE_SPECS[@]}"
2026-03-24T17:30:24.539 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 10G --rbd-default-data-pool '' rbd2/ns/test5
2026-03-24T17:30:24.571 INFO:tasks.workunit.client.0.vm01.stderr:+ BENCH_PIDS=()
2026-03-24T17:30:24.571 INFO:tasks.workunit.client.0.vm01.stderr:+ for spec in "${IMAGE_SPECS[@]}"
2026-03-24T17:30:24.571 INFO:tasks.workunit.client.0.vm01.stderr:+ BENCH_PIDS+=($!)
2026-03-24T17:30:24.571 INFO:tasks.workunit.client.0.vm01.stderr:+ for spec in "${IMAGE_SPECS[@]}"
2026-03-24T17:30:24.571 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 10G --io-threads 1 --rbd-cache false test1
2026-03-24T17:30:24.571 INFO:tasks.workunit.client.0.vm01.stderr:+ BENCH_PIDS+=($!)
2026-03-24T17:30:24.571 INFO:tasks.workunit.client.0.vm01.stderr:+ for spec in "${IMAGE_SPECS[@]}"
2026-03-24T17:30:24.571 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 10G --io-threads 1 --rbd-cache false rbd1/test2
2026-03-24T17:30:24.571 INFO:tasks.workunit.client.0.vm01.stderr:+ BENCH_PIDS+=($!)
2026-03-24T17:30:24.571 INFO:tasks.workunit.client.0.vm01.stderr:+ for spec in "${IMAGE_SPECS[@]}"
2026-03-24T17:30:24.571 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 10G --io-threads 1 --rbd-cache false rbd1/ns/test3
2026-03-24T17:30:24.571 INFO:tasks.workunit.client.0.vm01.stderr:+ BENCH_PIDS+=($!)
2026-03-24T17:30:24.571 INFO:tasks.workunit.client.0.vm01.stderr:+ for spec in "${IMAGE_SPECS[@]}"
2026-03-24T17:30:24.571 INFO:tasks.workunit.client.0.vm01.stderr:+ BENCH_PIDS+=($!)
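The interleaved `BENCH_PIDS+=($!)` lines above record the PID of each backgrounded `rbd bench` writer so they can all be killed once the iostat checks pass. A self-contained sketch of the same pattern, with `sleep` standing in for the long-running `rbd bench` jobs and an illustrative subset of the specs:

```shell
#!/usr/bin/env bash
# The BENCH_PIDS pattern from the trace: launch one background worker
# per image spec, record each PID via $!, then kill and reap them all.
IMAGE_SPECS=("test1" "rbd1/test2")   # illustrative subset
BENCH_PIDS=()
for spec in "${IMAGE_SPECS[@]}"; do
    sleep 60 &                       # stand-in for: rbd bench ... "$spec"
    BENCH_PIDS+=($!)                 # remember the worker's PID
done

for pid in "${BENCH_PIDS[@]}"; do
    kill "$pid"                      # stop the worker
done
wait                                 # reap all background jobs
echo "launched and reaped ${#BENCH_PIDS[@]} workers"
```

Capturing `$!` immediately after each `&` is the key detail; a later `$!` would only name the most recent job.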
2026-03-24T17:30:24.571 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 10G --io-threads 1 --rbd-cache false rbd2/ns/test5
2026-03-24T17:30:24.573 INFO:tasks.workunit.client.0.vm01.stderr:++ jq -r 'map(.image) | sort | join(" ")'
2026-03-24T17:30:24.575 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 10G --io-threads 1 --rbd-cache false rbd2/test4
2026-03-24T17:30:24.581 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd perf image iostat --format json rbd1
2026-03-24T17:30:24.622 INFO:tasks.workunit.client.0.vm01.stderr:rbd: waiting for initial image stats
2026-03-24T17:30:24.622 INFO:tasks.workunit.client.0.vm01.stderr: 
2026-03-24T17:30:34.630 INFO:tasks.workunit.client.0.vm01.stderr:+ test test2 = test2
2026-03-24T17:30:34.630 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd perf image iostat --format json rbd1/ns
2026-03-24T17:30:34.631 INFO:tasks.workunit.client.0.vm01.stderr:++ jq -r 'map(.image) | sort | join(" ")'
2026-03-24T17:30:34.700 INFO:tasks.workunit.client.0.vm01.stderr:rbd: waiting for initial image stats
2026-03-24T17:30:34.700 INFO:tasks.workunit.client.0.vm01.stderr: 
2026-03-24T17:30:44.712 INFO:tasks.workunit.client.0.vm01.stderr:+ test test3 = test3
2026-03-24T17:30:44.713 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd perf image iostat --format json --rbd-default-pool rbd1 /ns
2026-03-24T17:30:44.713 INFO:tasks.workunit.client.0.vm01.stderr:++ jq -r 'map(.image) | sort | join(" ")'
2026-03-24T17:30:44.780 INFO:tasks.workunit.client.0.vm01.stderr:+ test test3 = test3
2026-03-24T17:30:44.780 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd perf image iostat --format json --pool rbd2
2026-03-24T17:30:44.780 INFO:tasks.workunit.client.0.vm01.stderr:++ jq -r 'map(.image) | sort | join(" ")'
2026-03-24T17:30:44.851 INFO:tasks.workunit.client.0.vm01.stderr:rbd: waiting for initial image stats
2026-03-24T17:30:44.851 INFO:tasks.workunit.client.0.vm01.stderr: 
2026-03-24T17:30:54.858 INFO:tasks.workunit.client.0.vm01.stderr:+ test test4 = test4
2026-03-24T17:30:54.858 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd perf image iostat --format json --pool rbd2 --namespace ns
2026-03-24T17:30:54.859 INFO:tasks.workunit.client.0.vm01.stderr:++ jq -r 'map(.image) | sort | join(" ")'
2026-03-24T17:30:54.915 INFO:tasks.workunit.client.0.vm01.stderr:rbd: waiting for initial image stats
2026-03-24T17:30:54.915 INFO:tasks.workunit.client.0.vm01.stderr: 
2026-03-24T17:31:04.925 INFO:tasks.workunit.client.0.vm01.stderr:+ test test5 = test5
2026-03-24T17:31:04.925 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd perf image iostat --format json --rbd-default-pool rbd2 --namespace ns
2026-03-24T17:31:04.925 INFO:tasks.workunit.client.0.vm01.stderr:++ jq -r 'map(.image) | sort | join(" ")'
2026-03-24T17:31:05.005 INFO:tasks.workunit.client.0.vm01.stderr:+ test test5 = test5
2026-03-24T17:31:05.005 INFO:tasks.workunit.client.0.vm01.stderr:++ jq -r 'map(.image) | sort | join(" ")'
2026-03-24T17:31:05.007 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd perf image iostat --format json
2026-03-24T17:31:05.090 INFO:tasks.workunit.client.0.vm01.stderr:rbd: waiting for initial image stats
2026-03-24T17:31:05.090 INFO:tasks.workunit.client.0.vm01.stderr: 
2026-03-24T17:31:20.102 INFO:tasks.workunit.client.0.vm01.stderr:+ test 'test1 test2 test3 test4 test5' = 'test1 test2 test3 test4 test5'
2026-03-24T17:31:20.102 INFO:tasks.workunit.client.0.vm01.stderr:+ for pid in "${BENCH_PIDS[@]}"
2026-03-24T17:31:20.102 INFO:tasks.workunit.client.0.vm01.stderr:+ kill 82254
2026-03-24T17:31:20.102 INFO:tasks.workunit.client.0.vm01.stderr:+ for pid in "${BENCH_PIDS[@]}"
2026-03-24T17:31:20.102 INFO:tasks.workunit.client.0.vm01.stderr:+ kill 82255
2026-03-24T17:31:20.102 INFO:tasks.workunit.client.0.vm01.stderr:+ for pid in "${BENCH_PIDS[@]}"
2026-03-24T17:31:20.102 INFO:tasks.workunit.client.0.vm01.stderr:+ kill 82256
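Every iostat check above pipes `rbd perf image iostat --format json` through the same jq pipeline, `map(.image) | sort | join(" ")`, and compares the result against the expected space-separated image list. A sketch of that pipeline fed canned JSON in place of a live cluster (the field layout is reduced to just the `image` key the filter reads, and jq must be installed):

```shell
#!/usr/bin/env bash
# The jq pipeline from the trace, run against canned iostat-style JSON
# instead of `rbd perf image iostat --format json`.
json='[{"image":"test4"},{"image":"test5"}]'
names=$(echo "$json" | jq -r 'map(.image) | sort | join(" ")')
test "$names" = "test4 test5" && echo "image list matches: $names"
```

Sorting before joining makes the comparison independent of the order in which the mgr reports the images.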
2026-03-24T17:31:20.102 INFO:tasks.workunit.client.0.vm01.stderr:+ for pid in "${BENCH_PIDS[@]}"
2026-03-24T17:31:20.102 INFO:tasks.workunit.client.0.vm01.stderr:+ kill 82257
2026-03-24T17:31:20.102 INFO:tasks.workunit.client.0.vm01.stderr:+ for pid in "${BENCH_PIDS[@]}"
2026-03-24T17:31:20.102 INFO:tasks.workunit.client.0.vm01.stderr:+ kill 82258
2026-03-24T17:31:20.103 INFO:tasks.workunit.client.0.vm01.stderr:+ wait
2026-03-24T17:31:20.143 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images
2026-03-24T17:31:20.143 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:20.224 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:20.302 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:20.378 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:20.454 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:20.534 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:20.611 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:20.690 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:20.770 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:20.847 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:20.953 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:21.029 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:21.106 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:21.182 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:24.141 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:24.221 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:24.301 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:24.381 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:24.465 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd pool rm rbd2 rbd2 --yes-i-really-really-mean-it
2026-03-24T17:31:25.140 INFO:tasks.workunit.client.0.vm01.stderr:pool 'rbd2' does not exist
2026-03-24T17:31:25.153 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd pool rm rbd1 rbd1 --yes-i-really-really-mean-it
2026-03-24T17:31:26.038 INFO:tasks.workunit.client.0.vm01.stderr:pool 'rbd1' does not exist
2026-03-24T17:31:26.067 INFO:tasks.workunit.client.0.vm01.stdout:testing recovery of perf handler after module's RADOS client is blocklisted...
2026-03-24T17:31:26.067 INFO:tasks.workunit.client.0.vm01.stderr:+ test_perf_image_iostat_recovery
2026-03-24T17:31:26.067 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'testing recovery of perf handler after module'\''s RADOS client is blocklisted...'
2026-03-24T17:31:26.067 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images
2026-03-24T17:31:26.067 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:26.181 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:26.263 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:26.343 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:26.424 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:26.509 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:26.592 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:26.675 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:26.759 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:26.840 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:26.926 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:27.007 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:27.089 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:27.374 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:27.454 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:27.534 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:27.614 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:27.696 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS
2026-03-24T17:31:27.780 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd pool create rbd3 8
2026-03-24T17:31:28.965 INFO:tasks.workunit.client.0.vm01.stderr:pool 'rbd3' already exists
2026-03-24T17:31:28.977 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd pool init rbd3
2026-03-24T17:31:31.932 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd namespace create rbd3/ns
2026-03-24T17:31:31.959 INFO:tasks.workunit.client.0.vm01.stderr:+ IMAGE_SPECS=("rbd3/test1" "rbd3/ns/test2")
2026-03-24T17:31:31.959 INFO:tasks.workunit.client.0.vm01.stderr:+ for spec in "${IMAGE_SPECS[@]}"
2026-03-24T17:31:31.959 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 10G --rbd-default-data-pool '' rbd3/test1
2026-03-24T17:31:31.989 INFO:tasks.workunit.client.0.vm01.stderr:+ for spec in "${IMAGE_SPECS[@]}"
2026-03-24T17:31:31.989 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 10G --rbd-default-data-pool '' rbd3/ns/test2
2026-03-24T17:31:32.020 INFO:tasks.workunit.client.0.vm01.stderr:+ BENCH_PIDS=()
2026-03-24T17:31:32.020 INFO:tasks.workunit.client.0.vm01.stderr:+ for spec in "${IMAGE_SPECS[@]}"
2026-03-24T17:31:32.020 INFO:tasks.workunit.client.0.vm01.stderr:+ BENCH_PIDS+=($!)
2026-03-24T17:31:32.020 INFO:tasks.workunit.client.0.vm01.stderr:+ for spec in "${IMAGE_SPECS[@]}" 2026-03-24T17:31:32.020 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 10G --io-threads 1 --rbd-cache false rbd3/test1 2026-03-24T17:31:32.020 INFO:tasks.workunit.client.0.vm01.stderr:+ BENCH_PIDS+=($!) 2026-03-24T17:31:32.020 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 10G --io-threads 1 --rbd-cache false rbd3/ns/test2 2026-03-24T17:31:32.020 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd perf image iostat --format json rbd3 2026-03-24T17:31:32.021 INFO:tasks.workunit.client.0.vm01.stderr:++ jq -r 'map(.image) | sort | join(" ")' 2026-03-24T17:31:32.049 INFO:tasks.workunit.client.0.vm01.stderr:rbd: waiting for initial image stats 2026-03-24T17:31:32.049 INFO:tasks.workunit.client.0.vm01.stderr: 2026-03-24T17:31:42.057 INFO:tasks.workunit.client.0.vm01.stderr:+ test test1 = test1 2026-03-24T17:31:42.058 INFO:tasks.workunit.client.0.vm01.stderr:++ ceph mgr dump 2026-03-24T17:31:42.058 INFO:tasks.workunit.client.0.vm01.stderr:++ jq '.active_clients[]' 2026-03-24T17:31:42.059 INFO:tasks.workunit.client.0.vm01.stderr:++ jq -r '[.addrvec[0].addr, "/", .addrvec[0].nonce|tostring] | add' 2026-03-24T17:31:42.072 INFO:tasks.workunit.client.0.vm01.stderr:++ jq 'select(.name == "rbd_support")' 2026-03-24T17:31:42.392 INFO:tasks.workunit.client.0.vm01.stderr:+ CLIENT_ADDR=192.168.123.101:0/3800234888 2026-03-24T17:31:42.392 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd blocklist add 192.168.123.101:0/3800234888 2026-03-24T17:31:44.161 INFO:tasks.workunit.client.0.vm01.stderr:blocklisting 192.168.123.101:0/3800234888 until 2026-03-24T18:31:43.231241+0000 (3600 sec) 2026-03-24T17:31:44.178 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd perf image iostat --format json rbd3/ns 2026-03-24T17:31:44.178 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd perf image 
iostat --format json rbd3/ns 2026-03-24T17:31:44.210 INFO:tasks.workunit.client.0.vm01.stderr:rbd: waiting for initial image stats 2026-03-24T17:31:44.210 INFO:tasks.workunit.client.0.vm01.stderr: 2026-03-24T17:31:49.212 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:31:49.207+0000 7f1bd1e7a640 -1 librbd::api::Image: list_images_v2: error listing image in directory: (108) Cannot send after transport endpoint shutdown 2026-03-24T17:31:49.212 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:31:49.207+0000 7f1bd1e7a640 -1 librbd::api::Image: list_images: error listing v2 images: (108) Cannot send after transport endpoint shutdown 2026-03-24T17:31:49.213 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:31:49.207+0000 7f1bd5681640 -1 mgr.server reply reply (2) No such file or directory '309267baabef' 2026-03-24T17:31:49.213 INFO:tasks.workunit.client.0.vm01.stderr:rbd: mgr command failed: (2) No such file or directory: '309267baabef' 2026-03-24T17:31:49.216 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-24T17:31:49.216 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10 2026-03-24T17:31:59.219 INFO:tasks.workunit.client.0.vm01.stderr:++ seq 24 2026-03-24T17:31:59.222 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 24` 2026-03-24T17:31:59.223 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd perf image iostat --format json rbd3/ns 2026-03-24T17:31:59.223 INFO:tasks.workunit.client.0.vm01.stderr:++ jq -r 'map(.image) | sort | join(" ")' 2026-03-24T17:31:59.272 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:31:59.267+0000 7f1bd5681640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again 2026-03-24T17:31:59.275 INFO:tasks.workunit.client.0.vm01.stderr:rbd: mgr command failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again 2026-03-24T17:31:59.303 INFO:tasks.workunit.client.0.vm01.stderr:+ test '' = test2 2026-03-24T17:31:59.303 
INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10 2026-03-24T17:32:09.304 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 24` 2026-03-24T17:32:09.305 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd perf image iostat --format json rbd3/ns 2026-03-24T17:32:09.306 INFO:tasks.workunit.client.0.vm01.stderr:++ jq -r 'map(.image) | sort | join(" ")' 2026-03-24T17:32:09.346 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:32:09.343+0000 7f1bd5681640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again 2026-03-24T17:32:09.346 INFO:tasks.workunit.client.0.vm01.stderr:rbd: mgr command failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again 2026-03-24T17:32:09.351 INFO:tasks.workunit.client.0.vm01.stderr:+ test '' = test2 2026-03-24T17:32:09.352 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10 2026-03-24T17:32:19.353 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 24` 2026-03-24T17:32:19.356 INFO:tasks.workunit.client.0.vm01.stderr:++ jq -r 'map(.image) | sort | join(" ")' 2026-03-24T17:32:19.357 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd perf image iostat --format json rbd3/ns 2026-03-24T17:32:19.400 INFO:tasks.workunit.client.0.vm01.stderr:rbd: waiting for initial image stats 2026-03-24T17:32:19.400 INFO:tasks.workunit.client.0.vm01.stderr: 2026-03-24T17:32:29.410 INFO:tasks.workunit.client.0.vm01.stderr:+ test test2 = test2 2026-03-24T17:32:29.410 INFO:tasks.workunit.client.0.vm01.stderr:+ break 2026-03-24T17:32:29.410 INFO:tasks.workunit.client.0.vm01.stderr:+ for pid in "${BENCH_PIDS[@]}" 2026-03-24T17:32:29.410 INFO:tasks.workunit.client.0.vm01.stderr:+ kill 84183 2026-03-24T17:32:29.410 INFO:tasks.workunit.client.0.vm01.stderr:+ for pid in "${BENCH_PIDS[@]}" 2026-03-24T17:32:29.410 INFO:tasks.workunit.client.0.vm01.stderr:+ kill 84184 2026-03-24T17:32:29.410 INFO:tasks.workunit.client.0.vm01.stderr:+ wait 2026-03-24T17:32:29.434 
INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images 2026-03-24T17:32:29.434 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:29.516 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:29.594 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:29.672 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:29.751 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:29.831 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:29.914 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:29.997 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:30.091 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:30.174 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:30.258 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:30.341 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:30.422 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:30.503 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:30.584 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:30.664 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:30.742 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:30.825 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:30.908 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd pool rm rbd3 rbd3 --yes-i-really-really-mean-it 2026-03-24T17:32:31.297 INFO:tasks.workunit.client.0.vm01.stderr:pool 'rbd3' does not exist 2026-03-24T17:32:31.311 INFO:tasks.workunit.client.0.vm01.stderr:+ test_mirror_pool_peer_bootstrap_create 2026-03-24T17:32:31.311 INFO:tasks.workunit.client.0.vm01.stdout:testing mirror pool peer bootstrap 
create... 2026-03-24T17:32:31.311 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'testing mirror pool peer bootstrap create...' 2026-03-24T17:32:31.311 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images 2026-03-24T17:32:31.312 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:31.623 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:31.703 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:31.783 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:31.862 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:31.942 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:32.024 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:32.105 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:32.188 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:32.273 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:32.571 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:32.651 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:32.735 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:32.821 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:32.903 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:32.987 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:33.310 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:33.397 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:33.474 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd pool create rbd1 8 2026-03-24T17:32:34.288 INFO:tasks.workunit.client.0.vm01.stderr:pool 'rbd1' already exists 2026-03-24T17:32:34.300 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd pool init rbd1 
2026-03-24T17:32:37.259 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror pool enable rbd1 image 2026-03-24T17:32:37.284 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd pool create rbd2 8 2026-03-24T17:32:38.310 INFO:tasks.workunit.client.0.vm01.stderr:pool 'rbd2' already exists 2026-03-24T17:32:38.322 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd pool init rbd2 2026-03-24T17:32:41.529 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd mirror pool enable rbd2 pool 2026-03-24T17:32:41.558 INFO:tasks.workunit.client.0.vm01.stderr:+ readarray -t MON_ADDRS 2026-03-24T17:32:41.558 INFO:tasks.workunit.client.0.vm01.stderr:++ ceph mon dump 2026-03-24T17:32:41.558 INFO:tasks.workunit.client.0.vm01.stderr:++ sed -n 's/^[0-9]: \(.*\) mon\.[a-z]$/\1/p' 2026-03-24T17:32:41.818 INFO:tasks.workunit.client.0.vm01.stderr:dumped monmap epoch 1 2026-03-24T17:32:41.831 INFO:tasks.workunit.client.0.vm01.stderr:+ BAD_MON_ADDR=1.2.3.4:6789 2026-03-24T17:32:41.831 INFO:tasks.workunit.client.0.vm01.stderr:+ MON_HOST='[v2:192.168.123.101:3300/0,v1:192.168.123.101:6789/0],1.2.3.4:6789' 2026-03-24T17:32:41.831 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd mirror pool peer bootstrap create --mon-host '[v2:192.168.123.101:3300/0,v1:192.168.123.101:6789/0],1.2.3.4:6789' rbd1 2026-03-24T17:32:41.831 INFO:tasks.workunit.client.0.vm01.stderr:++ base64 -d 2026-03-24T17:32:41.859 INFO:tasks.workunit.client.0.vm01.stderr:+ TOKEN='{"fsid":"dc403d64-7ddd-4e06-90c8-8f9c41489fa2","client_id":"rbd-mirror-peer","key":"AQC5ysJp7JzvMhAAyMjhiRs813ppf6rtgRd7/w==","mon_host":"[v2:192.168.123.101:3300/0,v1:192.168.123.101:6789/0]"}' 2026-03-24T17:32:41.859 INFO:tasks.workunit.client.0.vm01.stderr:++ jq -r .fsid 2026-03-24T17:32:41.868 INFO:tasks.workunit.client.0.vm01.stderr:+ TOKEN_FSID=dc403d64-7ddd-4e06-90c8-8f9c41489fa2 2026-03-24T17:32:41.868 INFO:tasks.workunit.client.0.vm01.stderr:++ jq -r .client_id 2026-03-24T17:32:41.877 INFO:tasks.workunit.client.0.vm01.stderr:+ TOKEN_CLIENT_ID=rbd-mirror-peer 
2026-03-24T17:32:41.877 INFO:tasks.workunit.client.0.vm01.stderr:++ jq -r .key 2026-03-24T17:32:41.887 INFO:tasks.workunit.client.0.vm01.stderr:+ TOKEN_KEY=AQC5ysJp7JzvMhAAyMjhiRs813ppf6rtgRd7/w== 2026-03-24T17:32:41.887 INFO:tasks.workunit.client.0.vm01.stderr:++ jq -r .mon_host 2026-03-24T17:32:41.896 INFO:tasks.workunit.client.0.vm01.stderr:+ TOKEN_MON_HOST='[v2:192.168.123.101:3300/0,v1:192.168.123.101:6789/0]' 2026-03-24T17:32:41.896 INFO:tasks.workunit.client.0.vm01.stderr:++ ceph fsid 2026-03-24T17:32:42.164 INFO:tasks.workunit.client.0.vm01.stderr:+ test dc403d64-7ddd-4e06-90c8-8f9c41489fa2 = dc403d64-7ddd-4e06-90c8-8f9c41489fa2 2026-03-24T17:32:42.164 INFO:tasks.workunit.client.0.vm01.stderr:++ ceph auth get-key client.rbd-mirror-peer 2026-03-24T17:32:42.429 INFO:tasks.workunit.client.0.vm01.stderr:+ test AQC5ysJp7JzvMhAAyMjhiRs813ppf6rtgRd7/w== = AQC5ysJp7JzvMhAAyMjhiRs813ppf6rtgRd7/w== 2026-03-24T17:32:42.429 INFO:tasks.workunit.client.0.vm01.stderr:+ for addr in "${MON_ADDRS[@]}" 2026-03-24T17:32:42.429 INFO:tasks.workunit.client.0.vm01.stderr:+ fgrep '[v2:192.168.123.101:3300/0,v1:192.168.123.101:6789/0]' 2026-03-24T17:32:42.432 INFO:tasks.workunit.client.0.vm01.stdout:[v2:192.168.123.101:3300/0,v1:192.168.123.101:6789/0] 2026-03-24T17:32:42.432 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail fgrep 1.2.3.4:6789 2026-03-24T17:32:42.432 INFO:tasks.workunit.client.0.vm01.stderr:+ fgrep 1.2.3.4:6789 2026-03-24T17:32:42.433 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-24T17:32:42.434 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd mirror pool peer bootstrap create --mon-host '[v2:192.168.123.101:3300/0,v1:192.168.123.101:6789/0],1.2.3.4:6789' rbd1 2026-03-24T17:32:42.434 INFO:tasks.workunit.client.0.vm01.stderr:++ base64 -d 2026-03-24T17:32:42.458 INFO:tasks.workunit.client.0.vm01.stderr:+ test 
'{"fsid":"dc403d64-7ddd-4e06-90c8-8f9c41489fa2","client_id":"rbd-mirror-peer","key":"AQC5ysJp7JzvMhAAyMjhiRs813ppf6rtgRd7/w==","mon_host":"[v2:192.168.123.101:3300/0,v1:192.168.123.101:6789/0]"}' = '{"fsid":"dc403d64-7ddd-4e06-90c8-8f9c41489fa2","client_id":"rbd-mirror-peer","key":"AQC5ysJp7JzvMhAAyMjhiRs813ppf6rtgRd7/w==","mon_host":"[v2:192.168.123.101:3300/0,v1:192.168.123.101:6789/0]"}' 2026-03-24T17:32:42.458 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd mirror pool peer bootstrap create rbd1 2026-03-24T17:32:42.458 INFO:tasks.workunit.client.0.vm01.stderr:++ base64 -d 2026-03-24T17:32:42.481 INFO:tasks.workunit.client.0.vm01.stderr:+ test '{"fsid":"dc403d64-7ddd-4e06-90c8-8f9c41489fa2","client_id":"rbd-mirror-peer","key":"AQC5ysJp7JzvMhAAyMjhiRs813ppf6rtgRd7/w==","mon_host":"[v2:192.168.123.101:3300/0,v1:192.168.123.101:6789/0]"}' = '{"fsid":"dc403d64-7ddd-4e06-90c8-8f9c41489fa2","client_id":"rbd-mirror-peer","key":"AQC5ysJp7JzvMhAAyMjhiRs813ppf6rtgRd7/w==","mon_host":"[v2:192.168.123.101:3300/0,v1:192.168.123.101:6789/0]"}' 2026-03-24T17:32:42.482 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd mirror pool peer bootstrap create --mon-host '[v2:192.168.123.101:3300/0,v1:192.168.123.101:6789/0],1.2.3.4:6789' rbd2 2026-03-24T17:32:42.482 INFO:tasks.workunit.client.0.vm01.stderr:++ base64 -d 2026-03-24T17:32:42.505 INFO:tasks.workunit.client.0.vm01.stderr:+ test '{"fsid":"dc403d64-7ddd-4e06-90c8-8f9c41489fa2","client_id":"rbd-mirror-peer","key":"AQC5ysJp7JzvMhAAyMjhiRs813ppf6rtgRd7/w==","mon_host":"[v2:192.168.123.101:3300/0,v1:192.168.123.101:6789/0]"}' = '{"fsid":"dc403d64-7ddd-4e06-90c8-8f9c41489fa2","client_id":"rbd-mirror-peer","key":"AQC5ysJp7JzvMhAAyMjhiRs813ppf6rtgRd7/w==","mon_host":"[v2:192.168.123.101:3300/0,v1:192.168.123.101:6789/0]"}' 2026-03-24T17:32:42.505 INFO:tasks.workunit.client.0.vm01.stderr:++ rbd mirror pool peer bootstrap create rbd2 2026-03-24T17:32:42.505 INFO:tasks.workunit.client.0.vm01.stderr:++ base64 -d 2026-03-24T17:32:42.531 
INFO:tasks.workunit.client.0.vm01.stderr:+ test '{"fsid":"dc403d64-7ddd-4e06-90c8-8f9c41489fa2","client_id":"rbd-mirror-peer","key":"AQC5ysJp7JzvMhAAyMjhiRs813ppf6rtgRd7/w==","mon_host":"[v2:192.168.123.101:3300/0,v1:192.168.123.101:6789/0]"}' = '{"fsid":"dc403d64-7ddd-4e06-90c8-8f9c41489fa2","client_id":"rbd-mirror-peer","key":"AQC5ysJp7JzvMhAAyMjhiRs813ppf6rtgRd7/w==","mon_host":"[v2:192.168.123.101:3300/0,v1:192.168.123.101:6789/0]"}' 2026-03-24T17:32:42.531 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd pool rm rbd2 rbd2 --yes-i-really-really-mean-it 2026-03-24T17:32:43.578 INFO:tasks.workunit.client.0.vm01.stderr:pool 'rbd2' does not exist 2026-03-24T17:32:43.590 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd pool rm rbd1 rbd1 --yes-i-really-really-mean-it 2026-03-24T17:32:44.582 INFO:tasks.workunit.client.0.vm01.stderr:pool 'rbd1' does not exist 2026-03-24T17:32:44.595 INFO:tasks.workunit.client.0.vm01.stdout:testing removing pool under running tasks... 2026-03-24T17:32:44.595 INFO:tasks.workunit.client.0.vm01.stderr:+ test_tasks_removed_pool 2026-03-24T17:32:44.595 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'testing removing pool under running tasks...' 
2026-03-24T17:32:44.595 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images 2026-03-24T17:32:44.595 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:44.679 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:44.758 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:44.839 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:44.920 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:45.001 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:45.082 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:45.162 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:45.646 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:45.725 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:45.893 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:45.975 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:46.055 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:46.138 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:46.225 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:46.305 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:46.387 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:46.467 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:32:46.551 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd pool create rbd2 8 2026-03-24T17:32:47.597 INFO:tasks.workunit.client.0.vm01.stderr:pool 'rbd2' already exists 2026-03-24T17:32:47.610 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd pool init rbd2 2026-03-24T17:32:50.569 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 1G foo 2026-03-24T17:32:50.788 
INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:32:50.783+0000 7f6bd3bef640 0 --2- 192.168.123.101:0/1929385815 >> [v2:192.168.123.101:3300/0,v1:192.168.123.101:6789/0] conn(0x561bb5cef600 0x561bb5db0a90 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 2026-03-24T17:32:50.810 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create foo@snap 2026-03-24T17:32:51.582 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-24T17:32:51.593 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap protect foo@snap 2026-03-24T17:32:51.627 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone foo@snap bar 2026-03-24T17:32:51.685 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 1G rbd2/dummy 2026-03-24T17:32:51.716 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd bench --io-type write --io-pattern seq --io-size 1M --io-total 1G rbd2/dummy 2026-03-24T17:32:51.746 INFO:tasks.workunit.client.0.vm01.stdout:bench type write io_size 1048576 io_threads 16 bytes 1073741824 pattern sequential 2026-03-24T17:32:52.411 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:32:52.407+0000 7f4005d49640 0 -- 192.168.123.101:0/1608652932 >> [v2:192.168.123.101:6808/831386055,v1:192.168.123.101:6809/831386055] conn(0x560aa425ec70 msgr2=0x560aa42edab0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:32:52.829 INFO:tasks.workunit.client.0.vm01.stdout: SEC OPS OPS/SEC BYTES/SEC 2026-03-24T17:32:52.829 INFO:tasks.workunit.client.0.vm01.stdout: 1 288 311.475 311 MiB/s 2026-03-24T17:32:53.770 INFO:tasks.workunit.client.0.vm01.stdout: 2 544 291.666 292 MiB/s 2026-03-24T17:32:54.799 INFO:tasks.workunit.client.0.vm01.stdout: 3 800 276.798 277 MiB/s 2026-03-24T17:32:55.762 INFO:tasks.workunit.client.0.vm01.stdout:elapsed: 4 ops: 1024 ops/sec: 254.98 bytes/sec: 255 MiB/s 
2026-03-24T17:32:55.769 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create rbd2/dummy@snap 2026-03-24T17:32:56.173 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-24T17:32:56.187 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap protect rbd2/dummy@snap 2026-03-24T17:32:56.216 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in {1..5} 2026-03-24T17:32:56.216 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone rbd2/dummy@snap rbd2/dummy1 2026-03-24T17:32:56.260 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in {1..5} 2026-03-24T17:32:56.260 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone rbd2/dummy@snap rbd2/dummy2 2026-03-24T17:32:56.300 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in {1..5} 2026-03-24T17:32:56.300 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone rbd2/dummy@snap rbd2/dummy3 2026-03-24T17:32:56.341 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in {1..5} 2026-03-24T17:32:56.341 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone rbd2/dummy@snap rbd2/dummy4 2026-03-24T17:32:56.380 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in {1..5} 2026-03-24T17:32:56.380 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone rbd2/dummy@snap rbd2/dummy5 2026-03-24T17:32:56.422 INFO:tasks.workunit.client.0.vm01.stderr:++ ceph rbd task list 2026-03-24T17:32:56.672 INFO:tasks.workunit.client.0.vm01.stderr:+ test '[]' = '[]' 2026-03-24T17:32:56.672 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in {1..5} 2026-03-24T17:32:56.672 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph rbd task add flatten rbd2/dummy1 2026-03-24T17:32:57.444 INFO:tasks.workunit.client.0.vm01.stdout:{"sequence": 1, "id": "910689ba-ec1f-49d1-8e58-1fe083588f2a", "message": "Flattening image rbd2/dummy1", "refs": {"action": "flatten", "pool_name": "rbd2", "pool_namespace": "", "image_name": "dummy1", "image_id": "32172570cbe5"}, "in_progress": true, "progress": 0.03515625} 2026-03-24T17:32:57.469 
INFO:tasks.workunit.client.0.vm01.stderr:+ for i in {1..5} 2026-03-24T17:32:57.469 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph rbd task add flatten rbd2/dummy2 2026-03-24T17:32:59.125 INFO:tasks.workunit.client.0.vm01.stdout:{"sequence": 2, "id": "07174b8d-7a50-4251-8e3f-be71f530b24a", "message": "Flattening image rbd2/dummy2", "refs": {"action": "flatten", "pool_name": "rbd2", "pool_namespace": "", "image_name": "dummy2", "image_id": "3219f828bcc1"}} 2026-03-24T17:32:59.153 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in {1..5} 2026-03-24T17:32:59.153 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph rbd task add flatten rbd2/dummy3 2026-03-24T17:33:00.788 INFO:tasks.workunit.client.0.vm01.stdout:{"sequence": 3, "id": "a1807bb8-6a8b-4fbd-9688-d5ecd40cc4a1", "message": "Flattening image rbd2/dummy3", "refs": {"action": "flatten", "pool_name": "rbd2", "pool_namespace": "", "image_name": "dummy3", "image_id": "321cbe7897ce"}} 2026-03-24T17:33:00.815 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in {1..5} 2026-03-24T17:33:00.815 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph rbd task add flatten rbd2/dummy4 2026-03-24T17:33:02.158 INFO:tasks.workunit.client.0.vm01.stdout:{"sequence": 4, "id": "bc4b5336-1d76-4239-a3b7-28643b152f02", "message": "Flattening image rbd2/dummy4", "refs": {"action": "flatten", "pool_name": "rbd2", "pool_namespace": "", "image_name": "dummy4", "image_id": "321f78ce4d7e"}} 2026-03-24T17:33:02.188 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in {1..5} 2026-03-24T17:33:02.188 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph rbd task add flatten rbd2/dummy5 2026-03-24T17:33:03.285 INFO:tasks.workunit.client.0.vm01.stdout:{"sequence": 5, "id": "f7a74e75-bc77-40de-b3cb-72043b5a6a83", "message": "Flattening image rbd2/dummy5", "refs": {"action": "flatten", "pool_name": "rbd2", "pool_namespace": "", "image_name": "dummy5", "image_id": "3222ae785ee5"}} 2026-03-24T17:33:03.316 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd pool delete 
rbd2 rbd2 --yes-i-really-really-mean-it 2026-03-24T17:33:04.257 INFO:tasks.workunit.client.0.vm01.stderr:pool 'rbd2' does not exist 2026-03-24T17:33:04.270 INFO:tasks.workunit.client.0.vm01.stderr:++ ceph rbd task list 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr:+ test '[ 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr: { 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr: "id": "07174b8d-7a50-4251-8e3f-be71f530b24a", 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr: "message": "Flattening image rbd2/dummy2", 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr: "refs": { 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr: "action": "flatten", 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr: "image_id": "3219f828bcc1", 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr: "image_name": "dummy2", 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr: "pool_name": "rbd2", 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr: "pool_namespace": "" 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr: }, 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr: "sequence": 2 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr: }, 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr: { 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr: "id": "a1807bb8-6a8b-4fbd-9688-d5ecd40cc4a1", 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr: "message": "Flattening image rbd2/dummy3", 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr: "refs": { 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr: "action": "flatten", 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr: "image_id": "321cbe7897ce", 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr: "image_name": "dummy3", 2026-03-24T17:33:04.564 
INFO:tasks.workunit.client.0.vm01.stderr: "pool_name": "rbd2", 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr: "pool_namespace": "" 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr: }, 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr: "sequence": 3 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr: }, 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr: { 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr: "id": "bc4b5336-1d76-4239-a3b7-28643b152f02", 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr: "message": "Flattening image rbd2/dummy4", 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr: "refs": { 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr: "action": "flatten", 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr: "image_id": "321f78ce4d7e", 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr: "image_name": "dummy4", 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr: "pool_name": "rbd2", 2026-03-24T17:33:04.564 INFO:tasks.workunit.client.0.vm01.stderr: "pool_namespace": "" 2026-03-24T17:33:04.565 INFO:tasks.workunit.client.0.vm01.stderr: }, 2026-03-24T17:33:04.565 INFO:tasks.workunit.client.0.vm01.stderr: "sequence": 4 2026-03-24T17:33:04.565 INFO:tasks.workunit.client.0.vm01.stderr: }, 2026-03-24T17:33:04.565 INFO:tasks.workunit.client.0.vm01.stderr: { 2026-03-24T17:33:04.565 INFO:tasks.workunit.client.0.vm01.stderr: "id": "f7a74e75-bc77-40de-b3cb-72043b5a6a83", 2026-03-24T17:33:04.565 INFO:tasks.workunit.client.0.vm01.stderr: "message": "Flattening image rbd2/dummy5", 2026-03-24T17:33:04.565 INFO:tasks.workunit.client.0.vm01.stderr: "refs": { 2026-03-24T17:33:04.565 INFO:tasks.workunit.client.0.vm01.stderr: "action": "flatten", 2026-03-24T17:33:04.565 INFO:tasks.workunit.client.0.vm01.stderr: "image_id": "3222ae785ee5", 2026-03-24T17:33:04.565 
INFO:tasks.workunit.client.0.vm01.stderr: "image_name": "dummy5", 2026-03-24T17:33:04.565 INFO:tasks.workunit.client.0.vm01.stderr: "pool_name": "rbd2", 2026-03-24T17:33:04.565 INFO:tasks.workunit.client.0.vm01.stderr: "pool_namespace": "" 2026-03-24T17:33:04.565 INFO:tasks.workunit.client.0.vm01.stderr: }, 2026-03-24T17:33:04.565 INFO:tasks.workunit.client.0.vm01.stderr: "sequence": 5 2026-03-24T17:33:04.565 INFO:tasks.workunit.client.0.vm01.stderr: } 2026-03-24T17:33:04.565 INFO:tasks.workunit.client.0.vm01.stderr:]' '!=' '[]' 2026-03-24T17:33:04.565 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info bar 2026-03-24T17:33:04.565 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'parent: ' 2026-03-24T17:33:04.647 INFO:tasks.workunit.client.0.vm01.stdout: parent: rbd/foo@snap 2026-03-24T17:33:04.647 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail rbd snap unprotect foo@snap 2026-03-24T17:33:04.647 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap unprotect foo@snap 2026-03-24T17:33:05.049 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:33:05.043+0000 7fe5ba8bf640 -1 librbd::SnapshotUnprotectRequest: cannot unprotect: at least 1 child(ren) [3208a979b319] in pool 'rbd' 2026-03-24T17:33:05.049 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:33:05.043+0000 7fe5ba8bf640 -1 librbd::SnapshotUnprotectRequest: encountered error: (16) Device or resource busy 2026-03-24T17:33:05.049 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:33:05.043+0000 7fe5ba8bf640 -1 librbd::SnapshotUnprotectRequest: 0x5629ea98fbc0 should_complete_error: ret_val=-16 2026-03-24T17:33:05.055 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:33:05.051+0000 7fe5bb0c0640 -1 librbd::SnapshotUnprotectRequest: 0x5629ea98fbc0 should_complete_error: ret_val=-16 2026-03-24T17:33:05.056 INFO:tasks.workunit.client.0.vm01.stderr:rbd: unprotecting snap failed: (16) Device or resource busy 2026-03-24T17:33:05.097 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 
2026-03-24T17:33:05.097 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph rbd task add flatten bar 2026-03-24T17:33:05.391 INFO:tasks.workunit.client.0.vm01.stdout:{"sequence": 6, "id": "ae34cd88-a0d8-44e3-ae2d-2965d5aad097", "message": "Flattening image rbd/bar", "refs": {"action": "flatten", "pool_name": "rbd", "pool_namespace": "", "image_name": "bar", "image_id": "3208a979b319"}, "retry_attempts": 1, "retry_time": "2026-03-24T17:33:35.337863"} 2026-03-24T17:33:05.404 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in {1..12} 2026-03-24T17:33:05.404 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info bar 2026-03-24T17:33:05.404 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'parent: ' 2026-03-24T17:33:05.430 INFO:tasks.workunit.client.0.vm01.stderr:+ break 2026-03-24T17:33:05.430 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info bar 2026-03-24T17:33:05.430 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail grep 'parent: ' 2026-03-24T17:33:05.430 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'parent: ' 2026-03-24T17:33:05.456 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-24T17:33:05.456 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap unprotect foo@snap 2026-03-24T17:33:05.492 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in {1..12} 2026-03-24T17:33:05.492 INFO:tasks.workunit.client.0.vm01.stderr:++ ceph rbd task list 2026-03-24T17:33:05.746 INFO:tasks.workunit.client.0.vm01.stderr:+ test '[]' = '[]' 2026-03-24T17:33:05.746 INFO:tasks.workunit.client.0.vm01.stderr:+ break 2026-03-24T17:33:05.746 INFO:tasks.workunit.client.0.vm01.stderr:++ ceph rbd task list 2026-03-24T17:33:05.992 INFO:tasks.workunit.client.0.vm01.stderr:+ test '[]' = '[]' 2026-03-24T17:33:05.992 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images 2026-03-24T17:33:05.992 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:06.073 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:06.152 
INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:06.253 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:06.334 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:06.412 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:06.490 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:06.570 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:06.654 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:06.738 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:07.012 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:07.094 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:07.252 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:07.330 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:07.409 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:07.488 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:07.563 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:07.637 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:07.718 INFO:tasks.workunit.client.0.vm01.stdout:testing task handler recovery after module's RADOS client is blocklisted... 2026-03-24T17:33:07.718 INFO:tasks.workunit.client.0.vm01.stderr:+ test_tasks_recovery 2026-03-24T17:33:07.718 INFO:tasks.workunit.client.0.vm01.stderr:+ echo 'testing task handler recovery after module'\''s RADOS client is blocklisted...' 
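The `+ for i in {1..12}` / `+ sleep 10` traces earlier come from a poll loop: after scheduling the flatten task, the workunit repeatedly runs `rbd info bar | grep 'parent: '` until the parent line disappears. A runnable sketch of that poll-until-gone pattern (the function name, stub condition, and the short sleep are illustrative, not from the Ceph tree):

```shell
#!/usr/bin/env bash
# Poll until a condition stops holding, as the workunit does while
# waiting for the flatten task to drop the clone's parent link.
wait_until_gone() {
    local tries=$1; shift
    local i
    for i in $(seq "$tries"); do
        "$@" || return 0   # condition no longer holds: done waiting
        sleep 0.1          # the workunit sleeps 10s between attempts
    done
    return 1               # still present after all attempts
}

# Stub condition: the "parent" exists for the first 2 polls, then vanishes.
COUNT=0
has_parent() {
    COUNT=$((COUNT + 1))
    [ "$COUNT" -le 2 ]
}

wait_until_gone 12 has_parent && echo "parent line gone"
```

With 12 attempts at 10 seconds each, the real loop gives the mgr's flatten task roughly two minutes to complete before the test gives up.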
2026-03-24T17:33:07.718 INFO:tasks.workunit.client.0.vm01.stderr:+ remove_images 2026-03-24T17:33:07.718 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:07.801 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:07.883 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:07.963 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:08.038 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:08.121 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:08.200 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:08.276 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:08.359 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:08.440 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:08.523 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:08.806 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:08.888 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:08.969 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:09.050 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:09.129 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:09.209 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:09.293 INFO:tasks.workunit.client.0.vm01.stderr:+ for img in $IMGS 2026-03-24T17:33:09.379 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd pool create rbd2 8 2026-03-24T17:33:10.119 INFO:tasks.workunit.client.0.vm01.stderr:pool 'rbd2' already exists 2026-03-24T17:33:10.132 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd pool init rbd2 2026-03-24T17:33:13.079 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd create --image-format 2 --size 1G rbd2/img1 
2026-03-24T17:33:13.109 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd bench --io-type write --io-pattern seq --io-size 1M --io-total 1G rbd2/img1 2026-03-24T17:33:13.137 INFO:tasks.workunit.client.0.vm01.stdout:bench type write io_size 1048576 io_threads 16 bytes 1073741824 pattern sequential 2026-03-24T17:33:14.181 INFO:tasks.workunit.client.0.vm01.stdout: SEC OPS OPS/SEC BYTES/SEC 2026-03-24T17:33:14.181 INFO:tasks.workunit.client.0.vm01.stdout: 1 272 303.797 304 MiB/s 2026-03-24T17:33:14.553 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:33:14.547+0000 7fa96eaee640 0 -- 192.168.123.101:0/513943635 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x7fa94c05c820 msgr2=0x7fa94c07cc20 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-24T17:33:15.182 INFO:tasks.workunit.client.0.vm01.stdout: 2 512 270.492 270 MiB/s 2026-03-24T17:33:16.152 INFO:tasks.workunit.client.0.vm01.stdout: 3 752 263.013 263 MiB/s 2026-03-24T17:33:16.183 INFO:tasks.workunit.client.0.vm01.stderr:2026-03-24T17:33:16.179+0000 7fa96eaee640 0 -- 192.168.123.101:0/513943635 >> [v2:192.168.123.101:6816/1022245844,v1:192.168.123.101:6817/1022245844] conn(0x7fa94c05c820 msgr2=0x7fa94c07cc00 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T17:33:17.187 INFO:tasks.workunit.client.0.vm01.stdout: 4 1008 258.847 259 MiB/s 2026-03-24T17:33:17.299 INFO:tasks.workunit.client.0.vm01.stdout:elapsed: 4 ops: 1024 ops/sec: 246.154 bytes/sec: 246 MiB/s 2026-03-24T17:33:17.306 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap create rbd2/img1@snap 2026-03-24T17:33:18.171 INFO:tasks.workunit.client.0.vm01.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 
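In the bench summary above, OPS/SEC and BYTES/SEC track each other because `--io-size 1M` makes every op move exactly 1 MiB, so bytes/sec is just ops/sec times the io size. Reproducing the final line's arithmetic (the 246.154 figure is taken from the summary above):

```shell
# With --io-size 1M, BYTES/SEC = OPS/SEC * io_size; rendered in MiB/s
# this collapses to the ops/sec figure itself.
awk 'BEGIN {
    ops_per_sec = 246.154      # elapsed summary: ops/sec from the log
    io_size    = 1048576       # --io-size 1M, in bytes
    printf "%.0f MiB/s\n", ops_per_sec * io_size / 1048576
}'
```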
2026-03-24T17:33:18.177 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap protect rbd2/img1@snap 2026-03-24T17:33:18.205 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd clone rbd2/img1@snap rbd2/clone1 2026-03-24T17:33:18.244 INFO:tasks.workunit.client.0.vm01.stderr:++ ceph mgr dump 2026-03-24T17:33:18.244 INFO:tasks.workunit.client.0.vm01.stderr:++ jq 'select(.name == "rbd_support")' 2026-03-24T17:33:18.245 INFO:tasks.workunit.client.0.vm01.stderr:++ jq '.active_clients[]' 2026-03-24T17:33:18.245 INFO:tasks.workunit.client.0.vm01.stderr:++ jq -r '[.addrvec[0].addr, "/", .addrvec[0].nonce|tostring] | add' 2026-03-24T17:33:18.500 INFO:tasks.workunit.client.0.vm01.stderr:+ CLIENT_ADDR=192.168.123.101:0/591039041 2026-03-24T17:33:18.500 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd blocklist add 192.168.123.101:0/591039041 2026-03-24T17:33:20.173 INFO:tasks.workunit.client.0.vm01.stderr:blocklisting 192.168.123.101:0/591039041 until 2026-03-24T18:33:19.223789+0000 (3600 sec) 2026-03-24T17:33:20.187 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail ceph rbd task add flatten rbd2/clone1 2026-03-24T17:33:20.187 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph rbd task add flatten rbd2/clone1 2026-03-24T17:33:20.356 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:33:20.351+0000 7f1bd9689640 -1 librbd::image::OpenRequest: failed to stat v2 image header: (108) Cannot send after transport endpoint shutdown 2026-03-24T17:33:20.356 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:33:20.351+0000 7f1bd9689640 -1 librbd::ImageState: 0x557d307bf700 failed to open image: (108) Cannot send after transport endpoint shutdown 2026-03-24T17:33:20.357 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:33:20.351+0000 7f1bd5681640 -1 mgr.server reply reply (11) Resource temporarily unavailable [errno 108] RBD connection was shutdown (error opening image b'clone1' at snapshot None) 2026-03-24T17:33:20.357 INFO:tasks.workunit.client.0.vm01.stderr:Error EAGAIN: [errno 108] RBD 
connection was shutdown (error opening image b'clone1' at snapshot None) 2026-03-24T17:33:20.361 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0 2026-03-24T17:33:20.361 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10 2026-03-24T17:33:30.363 INFO:tasks.workunit.client.0.vm01.stderr:++ seq 24 2026-03-24T17:33:30.364 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 24` 2026-03-24T17:33:30.364 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph rbd task add flatten rbd2/clone1 2026-03-24T17:33:30.541 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:33:30.535+0000 7f1bd5681640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again 2026-03-24T17:33:30.541 INFO:tasks.workunit.client.0.vm01.stderr:Error EAGAIN: rbd_support module is not ready, try again 2026-03-24T17:33:30.545 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10 2026-03-24T17:33:40.547 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 24` 2026-03-24T17:33:40.547 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph rbd task add flatten rbd2/clone1 2026-03-24T17:33:40.724 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:33:40.719+0000 7f1bd5681640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again 2026-03-24T17:33:40.724 INFO:tasks.workunit.client.0.vm01.stderr:Error EAGAIN: rbd_support module is not ready, try again 2026-03-24T17:33:40.728 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10 2026-03-24T17:33:50.729 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 24` 2026-03-24T17:33:50.729 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph rbd task add flatten rbd2/clone1 2026-03-24T17:33:50.911 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:33:50.907+0000 7f1bd5681640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again 2026-03-24T17:33:50.911 INFO:tasks.workunit.client.0.vm01.stderr:Error EAGAIN: rbd_support module is not ready, 
try again 2026-03-24T17:33:50.915 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10 2026-03-24T17:34:00.916 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 24` 2026-03-24T17:34:00.916 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph rbd task add flatten rbd2/clone1 2026-03-24T17:34:01.088 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:34:01.083+0000 7f1bd5681640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again 2026-03-24T17:34:01.088 INFO:tasks.workunit.client.0.vm01.stderr:Error EAGAIN: rbd_support module is not ready, try again 2026-03-24T17:34:01.093 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10 2026-03-24T17:34:11.094 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 24` 2026-03-24T17:34:11.094 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph rbd task add flatten rbd2/clone1 2026-03-24T17:34:11.260 INFO:tasks.ceph.mgr.x.vm01.stderr:2026-03-24T17:34:11.255+0000 7f1bd5681640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again 2026-03-24T17:34:11.260 INFO:tasks.workunit.client.0.vm01.stderr:Error EAGAIN: rbd_support module is not ready, try again 2026-03-24T17:34:11.264 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10 2026-03-24T17:34:21.265 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in `seq 24` 2026-03-24T17:34:21.265 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph rbd task add flatten rbd2/clone1 2026-03-24T17:34:22.124 INFO:tasks.workunit.client.0.vm01.stdout:{"sequence": 1, "id": "f4154e12-0ecc-410f-a2a6-7610e56b68c4", "message": "Flattening image rbd2/clone1", "refs": {"action": "flatten", "pool_name": "rbd2", "pool_namespace": "", "image_name": "clone1", "image_id": "33396bf9d3b8"}, "in_progress": true, "progress": 0.03515625} 2026-03-24T17:34:22.139 INFO:tasks.workunit.client.0.vm01.stderr:+ break 2026-03-24T17:34:22.139 INFO:tasks.workunit.client.0.vm01.stderr:++ ceph rbd task list 2026-03-24T17:34:22.537 
INFO:tasks.workunit.client.0.vm01.stderr:+ test '[
2026-03-24T17:34:22.537 INFO:tasks.workunit.client.0.vm01.stderr: {
2026-03-24T17:34:22.537 INFO:tasks.workunit.client.0.vm01.stderr: "id": "f4154e12-0ecc-410f-a2a6-7610e56b68c4",
2026-03-24T17:34:22.537 INFO:tasks.workunit.client.0.vm01.stderr: "in_progress": true,
2026-03-24T17:34:22.537 INFO:tasks.workunit.client.0.vm01.stderr: "message": "Flattening image rbd2/clone1",
2026-03-24T17:34:22.537 INFO:tasks.workunit.client.0.vm01.stderr: "progress": 0.17578125,
2026-03-24T17:34:22.537 INFO:tasks.workunit.client.0.vm01.stderr: "refs": {
2026-03-24T17:34:22.537 INFO:tasks.workunit.client.0.vm01.stderr: "action": "flatten",
2026-03-24T17:34:22.537 INFO:tasks.workunit.client.0.vm01.stderr: "image_id": "33396bf9d3b8",
2026-03-24T17:34:22.537 INFO:tasks.workunit.client.0.vm01.stderr: "image_name": "clone1",
2026-03-24T17:34:22.537 INFO:tasks.workunit.client.0.vm01.stderr: "pool_name": "rbd2",
2026-03-24T17:34:22.537 INFO:tasks.workunit.client.0.vm01.stderr: "pool_namespace": ""
2026-03-24T17:34:22.537 INFO:tasks.workunit.client.0.vm01.stderr: },
2026-03-24T17:34:22.537 INFO:tasks.workunit.client.0.vm01.stderr: "sequence": 1
2026-03-24T17:34:22.537 INFO:tasks.workunit.client.0.vm01.stderr: }
2026-03-24T17:34:22.537 INFO:tasks.workunit.client.0.vm01.stderr:]' '!=' '[]'
2026-03-24T17:34:22.537 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in {1..12}
2026-03-24T17:34:22.537 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'parent: '
2026-03-24T17:34:22.538 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info rbd2/clone1
2026-03-24T17:34:22.732 INFO:tasks.workunit.client.0.vm01.stdout: parent: rbd2/img1@snap
2026-03-24T17:34:22.732 INFO:tasks.workunit.client.0.vm01.stderr:+ sleep 10
2026-03-24T17:34:32.734 INFO:tasks.workunit.client.0.vm01.stderr:+ for i in {1..12}
2026-03-24T17:34:32.734 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info rbd2/clone1
2026-03-24T17:34:32.734 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'parent: '
2026-03-24T17:34:32.761 INFO:tasks.workunit.client.0.vm01.stderr:+ break
2026-03-24T17:34:32.761 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd info rbd2/clone1
2026-03-24T17:34:32.761 INFO:tasks.workunit.client.0.vm01.stderr:+ expect_fail grep 'parent: '
2026-03-24T17:34:32.761 INFO:tasks.workunit.client.0.vm01.stderr:+ grep 'parent: '
2026-03-24T17:34:32.788 INFO:tasks.workunit.client.0.vm01.stderr:+ return 0
2026-03-24T17:34:32.789 INFO:tasks.workunit.client.0.vm01.stderr:+ rbd snap unprotect rbd2/img1@snap
2026-03-24T17:34:32.820 INFO:tasks.workunit.client.0.vm01.stderr:++ ceph rbd task list
2026-03-24T17:34:33.068 INFO:tasks.workunit.client.0.vm01.stderr:+ test '[]' = '[]'
2026-03-24T17:34:33.068 INFO:tasks.workunit.client.0.vm01.stderr:+ ceph osd pool rm rbd2 rbd2 --yes-i-really-really-mean-it
2026-03-24T17:34:33.557 INFO:tasks.workunit.client.0.vm01.stderr:pool 'rbd2' does not exist
2026-03-24T17:34:33.589 INFO:tasks.workunit.client.0.vm01.stdout:OK
2026-03-24T17:34:33.590 INFO:tasks.workunit.client.0.vm01.stderr:+ echo OK
2026-03-24T17:34:33.590 INFO:teuthology.orchestra.run:Running command with timeout 3600
2026-03-24T17:34:33.590 DEBUG:teuthology.orchestra.run.vm01:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp
2026-03-24T17:34:33.711 INFO:tasks.workunit:Stopping ['rbd/cli_generic.sh'] on client.0...
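The recovery sequence above works as follows: after the rbd_support module's RADOS client is blocklisted, `ceph rbd task add` keeps returning EAGAIN ("rbd_support module is not ready, try again") until the module rebuilds its connection, so the workunit retries in a `for i in \`seq 24\`` loop with `sleep 10` between attempts (succeeding here on the sixth try, at 17:34:22). A runnable sketch of that retry pattern (`flaky_cmd` is a stand-in for the real `ceph` command, and the short sleep is illustrative):

```shell
#!/usr/bin/env bash
# Retry a command until it succeeds or the attempt budget runs out,
# mirroring the workunit's EAGAIN loop after blocklisting the mgr client.
retry() {
    local tries=$1; shift
    local i
    for i in $(seq "$tries"); do
        if "$@"; then
            return 0       # command succeeded: module is back
        fi
        sleep 0.1          # the workunit uses `sleep 10`
    done
    return 1               # never recovered within the budget
}

ATTEMPTS=0
flaky_cmd() {              # EAGAIN analog: fails on the first 5 calls
    ATTEMPTS=$((ATTEMPTS + 1))
    [ "$ATTEMPTS" -gt 5 ]
}

retry 24 flaky_cmd && echo "succeeded after $ATTEMPTS attempts"
```

With 24 attempts at 10 seconds apart, the real loop allows the module up to four minutes to recover; here it took roughly 60 seconds from the blocklist at 17:33:20 to the accepted task at 17:34:22.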
2026-03-24T17:34:33.711 DEBUG:teuthology.orchestra.run.vm01:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.0 /home/ubuntu/cephtest/clone.client.0 2026-03-24T17:34:34.284 DEBUG:teuthology.parallel:result is None 2026-03-24T17:34:34.284 DEBUG:teuthology.orchestra.run.vm01:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0 2026-03-24T17:34:34.291 INFO:tasks.workunit:Deleted dir /home/ubuntu/cephtest/mnt.0/client.0 2026-03-24T17:34:34.291 DEBUG:teuthology.orchestra.run.vm01:> rmdir -- /home/ubuntu/cephtest/mnt.0 2026-03-24T17:34:34.341 INFO:tasks.workunit:Deleted artificial mount point /home/ubuntu/cephtest/mnt.0/client.0 2026-03-24T17:34:34.341 DEBUG:teuthology.run_tasks:Unwinding manager ceph 2026-03-24T17:34:34.343 INFO:tasks.ceph.ceph_manager.ceph:waiting for clean 2026-03-24T17:34:34.343 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json 2026-03-24T17:34:34.536 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T17:34:34.537 INFO:teuthology.orchestra.run.vm01.stderr:dumped all 2026-03-24T17:34:34.549 
INFO:teuthology.orchestra.run.vm01.stdout:{"pg_ready":true,"pg_map":{"version":1491,"stamp":"2026-03-24T17:34:33.486096+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459307,"num_objects":9,"num_object_clones":0,"num_object_copies":18,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":9,"num_whiteouts":0,"num_read":84995,"num_read_kb":234448,"num_write":44465,"num_write_kb":8898113,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":48068,"ondisk_log_size":48068,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":34,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":5053768,"kb_used_data":2103832,"kb_used_omap":1305,"kb_used_meta":2948582,"kb_avail":278061752,"statfs":{"total":289910292480,"available":284735234048,"internally_reserved":0,"allocated":2154323968,"data_stored":4298748957,"data_compressed":22549812,"data_compressed_allocated":2147926016,"data_compressed_original":4295852032,"omap_allocated":1336479,"internal_metadata":3019348833},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_qu
eue_age_hist":{"histogram":[0,0,0,0,1,0,1,1],"upper_bound":256},"perf_stat":{"commit_latency_ms":55,"apply_latency_ms":55,"commit_latency_ns":55000000,"apply_latency_ns":55000000},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":-1287651673,"num_objects":-319,"num_object_clones":0,"num_object_copies":-638,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":-319,"num_whiteouts":0,"num_read":-2493,"num_read_kb":-235453,"num_write":-1786,"num_write_kb":-1258125,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":-5,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"10.369574"},"pg_stats":[{"pgid":"2.7","version":"237'4856","reported_seq":8583,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:33:23.072549+0000","last_change":"2026-03-24T17:33:06.881321+0000","last_active":"2026-03-24T17:33:23.072549+0000","last_peered":"2026-03-24T17:33:23.072549+0000","last_clean":"2026-03-24T17:33:23.072549+0000","last_became_active":"2026-03-24T16:53:01.825454+0000","last_became_peered":"2026-03-24T16:53:01.825454+0000","last_unstale":"2026-03-24T17:33:23.072549+0000","last_undegraded"
:"2026-03-24T17:33:23.072549+0000","last_fullsized":"2026-03-24T17:33:23.072549+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":4856,"log_dups_size":0,"ondisk_log_size":4856,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T21:48:43.097690+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00037862799999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":6676,"num_read_kb":23533,"num_write":4384,"num_write_kb":1011585,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.6","version":"237'5774","repor
ted_seq":10094,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:33:23.072700+0000","last_change":"2026-03-24T17:33:06.884172+0000","last_active":"2026-03-24T17:33:23.072700+0000","last_peered":"2026-03-24T17:33:23.072700+0000","last_clean":"2026-03-24T17:33:23.072700+0000","last_became_active":"2026-03-24T16:53:01.825605+0000","last_became_peered":"2026-03-24T16:53:01.825605+0000","last_unstale":"2026-03-24T17:33:23.072700+0000","last_undegraded":"2026-03-24T17:33:23.072700+0000","last_fullsized":"2026-03-24T17:33:23.072700+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":5774,"log_dups_size":0,"ondisk_log_size":5774,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-26T03:27:23.829221+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0031340059999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":7702,"num_read_kb":28675,"num_write":4426,"num_write_kb":1155866,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.5","version":"237'7847","reported_seq":11386,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:33:23.072577+0000","last_change":"2026-03-24T17:33:06.881391+0000","last_active":"2026-03-24T17:33:23.072577+0000","last_peered":"2026-03-24T17:33:23.072577+0000","last_clean":"2026-03-24T17:33:23.072577+0000","last_became_active":"2026-03-24T16:53:01.825817+0000","last_became_peered":"2026-03-24T16:53:01.825817+0000","last_unstale":"2026-03-24T17:33:23.072577+0000","last_undegraded":"2026-03-24T17:33:23.072577+0000","last_fullsized":"2026-03-24T17:33:23.072577+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_de
ep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":7847,"log_dups_size":0,"ondisk_log_size":7847,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T02:01:08.991385+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00030837699999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":25629,"num_read_kb":40418,"num_write":10625,"num_write_kb":1113062,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.4","version":"237'6992","reported_seq":11439,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:33:23.072616+0000","last_change":"2026-03-24T17:33:06.881834+0000","last_active":"2026-03-24T17:33:23.072616+0000","last_peered":"2026-03-24T17:33:23.072616+0000","last_clean":"2026-03-24T17:33:23.072616+0000","last_became_active":"202
6-03-24T16:53:01.823211+0000","last_became_peered":"2026-03-24T16:53:01.823211+0000","last_unstale":"2026-03-24T17:33:23.072616+0000","last_undegraded":"2026-03-24T17:33:23.072616+0000","last_fullsized":"2026-03-24T17:33:23.072616+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":6992,"log_dups_size":0,"ondisk_log_size":6992,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T18:24:38.787735+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00071319500000000002,"stat_sum":{"num_bytes":8,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":7479,"num_read_kb":28407,"num_write":5401,"num_write_kb":1218895,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no
_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.2","version":"237'5026","reported_seq":11075,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:33:23.523256+0000","last_change":"2026-03-24T17:33:06.880075+0000","last_active":"2026-03-24T17:33:23.523256+0000","last_peered":"2026-03-24T17:33:23.523256+0000","last_clean":"2026-03-24T17:33:23.523256+0000","last_became_active":"2026-03-24T16:53:01.823342+0000","last_became_peered":"2026-03-24T16:53:01.823342+0000","last_unstale":"2026-03-24T17:33:23.523256+0000","last_undegraded":"2026-03-24T17:33:23.523256+0000","last_fullsized":"2026-03-24T17:33:23.523256+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":5026,"log_dups_size":0,"ondisk_log_size":5026,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-26T01:11:42.583816+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00051557600000000001,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":9647,"num_read_kb":26613,"num_write":4477,"num_write_kb":1192636,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.1","version":"237'4740","reported_seq":9380,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:34:14.417545+0000","last_change":"2026-03-24T17:33:06.880366+0000","last_active":"2026-03-24T17:34:14.417545+0000","last_peered":"2026-03-24T17:34:14.417545+0000","last_clean":"2026-03-24T17:34:14.417545+0000","last_became_active":"2026-03-24T16:53:01.823089+0000","last_became_peered":"2026-03-24T16:53:01.823089+0000","last_unstale":"2026-03-24T17:34:14.417545+0000","last_undegraded":"2026-03-24T17:34:14.417545+0000","last_fullsized":"2026-03-24T17:34:14.417545+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_d
eep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":4740,"log_dups_size":0,"ondisk_log_size":4740,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T23:36:48.582129+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00026306099999999998,"stat_sum":{"num_bytes":0,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":10556,"num_read_kb":31959,"num_write":4465,"num_write_kb":1068977,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.0","version":"237'6037","reported_seq":9540,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:33:24.069466+0000","last_change":"2026-03-24T17:33:06.880259+0000","last_active":"2026-03-24T17:33:24.069466+0000","last_peered":"2026-03-24T17:33:24.069466+0000","last_clean":"2026-03-24T17:33:24.069466+0000","last_became_active":"2026
-03-24T16:53:01.822601+0000","last_became_peered":"2026-03-24T16:53:01.822601+0000","last_unstale":"2026-03-24T17:33:24.069466+0000","last_undegraded":"2026-03-24T17:33:24.069466+0000","last_fullsized":"2026-03-24T17:33:24.069466+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":6037,"log_dups_size":0,"ondisk_log_size":6037,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T00:11:47.602549+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00018387399999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":7404,"num_read_kb":25396,"num_write":5157,"num_write_kb":1049840,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_
missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.3","version":"237'6764","reported_seq":13178,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:33:23.072731+0000","last_change":"2026-03-24T17:33:06.881887+0000","last_active":"2026-03-24T17:33:23.072731+0000","last_peered":"2026-03-24T17:33:23.072731+0000","last_clean":"2026-03-24T17:33:23.072731+0000","last_became_active":"2026-03-24T16:53:01.825199+0000","last_became_peered":"2026-03-24T16:53:01.825199+0000","last_unstale":"2026-03-24T17:33:23.072731+0000","last_undegraded":"2026-03-24T17:33:23.072731+0000","last_fullsized":"2026-03-24T17:33:23.072731+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":6764,"log_dups_size":0,"ondisk_log_size":6764,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T20:45:16.308359+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00068371000000000003,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":9856,"num_read_kb":29410,"num_write":5473,"num_write_kb":1086668,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":1,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"1.0","version":"10'32","reported_seq":541,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:33:23.072650+0000","last_change":"2026-03-24T16:52:58.801131+0000","last_active":"2026-03-24T17:33:23.072650+0000","last_peered":"2026-03-24T17:33:23.072650+0000","last_clean":"2026-03-24T17:33:23.072650+0000","last_became_active":"2026-03-24T16:52:58.801011+0000","last_became_peered":"2026-03-24T16:52:58.801011+0000","last_unstale":"2026-03-24T17:33:23.072650+0000","last_undegraded":"2026-03-24T17:33:23.072650+0000","last_fullsized":"2026-03-24T17:33:23.072650+0000","mapping_epoch":9,"log_start":"0'0","ondisk_log_start":"0'0","created":9,"last_epoch_clean":10,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:52:57.792644+0000","last_deep_scrub":"0'0","last_deep_scr
ub_stamp":"2026-03-24T16:52:57.792644+0000","last_clean_scrub_stamp":"2026-03-24T16:52:57.792644+0000","objects_scrubbed":0,"log_size":32,"log_dups_size":0,"ondisk_log_size":32,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T17:11:40.022667+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}],"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":27,"num_objects":7,"num_object_clones":0,"num_object_copies":14,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":7,"num_whiteouts":0,"num_read":84949,"num_read_kb":234411,"num_write":44408,"num_write_kb":8897529,"num_scrub_errors":0,"n
um_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":16384,"data_stored":54,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1304181,"internal_metadata":0},"log_size":48036,"ondisk_log_size":48036,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":483328,"data_stored":918560,"data_compressed":5428,"data_compressed_allocated":442368,"data_compressed_original":884736,"omap_allocated":0,"internal_metad
ata":0},"log_size":32,"ondisk_log_size":32,"up":2,"acting":2,"num_store_stats":2}],"osd_stats":[{"osd":2,"up_from":8,"seq":34359738870,"num_pgs":7,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":1382936,"kb_used_data":507912,"kb_used_omap":369,"kb_used_meta":874638,"kb_avail":92988904,"statfs":{"total":96636764160,"available":95220637696,"internally_reserved":0,"allocated":520101888,"data_stored":1036947321,"data_compressed":5437952,"data_compressed_allocated":517996544,"data_compressed_original":1035993088,"omap_allocated":378208,"internal_metadata":895629984},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":46,"apply_latency_ms":46,"commit_latency_ns":46000000,"apply_latency_ns":46000000},"alerts":[]},{"osd":1,"up_from":8,"seq":34359738869,"num_pgs":13,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":1564436,"kb_used_data":545028,"kb_used_omap":548,"kb_used_meta":1018843,"kb_avail":92807404,"statfs":{"total":96636764160,"available":95034781696,"internally_reserved":0,"allocated":558108672,"data_stored":1112904217,"data_compressed":5836954,"data_compressed_allocated":555966464,"data_compressed_original":1111932928,"omap_allocated":561750,"internal_metadata":1043295658},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[0,0,0,0,1,0,1,1],"upper_bound":256},"perf_stat":{"commit_latency_ms":9,"apply_latency_ms":9,"commit_latency_ns":9000000,"apply_latency_ns":9000000},"alerts":[]},{"osd":0,"up_from":8,"seq":34359738872,"num_pgs":14,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":2106396,"kb_used_data":1050892,"kb_used_omap":387,"kb_used_meta":1055100,"kb_avail":92265444,"statfs":{"total":96636764160,"available":94479814656,"internally_reserved":0,"allocated":107611340
8,"data_stored":2148897419,"data_compressed":11274906,"data_compressed_allocated":1073963008,"data_compressed_original":2147926016,"omap_allocated":396521,"internal_metadata":1080423191},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":2714,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":2714,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":395193,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":541804,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":367184,"internal_metadata":0}]}}
2026-03-24T17:34:34.549 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json
2026-03-24T17:34:34.702 INFO:teuthology.orchestra.run.vm01.stdout:
2026-03-24T17:34:34.702 INFO:teuthology.orchestra.run.vm01.stderr:dumped all
2026-03-24T17:34:34.715
INFO:teuthology.orchestra.run.vm01.stdout:{"pg_ready":true,"pg_map":{"version":1491,"stamp":"2026-03-24T17:34:33.486096+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459307,"num_objects":9,"num_object_clones":0,"num_object_copies":18,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":9,"num_whiteouts":0,"num_read":84995,"num_read_kb":234448,"num_write":44465,"num_write_kb":8898113,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":48068,"ondisk_log_size":48068,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":34,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":5053768,"kb_used_data":2103832,"kb_used_omap":1305,"kb_used_meta":2948582,"kb_avail":278061752,"statfs":{"total":289910292480,"available":284735234048,"internally_reserved":0,"allocated":2154323968,"data_stored":4298748957,"data_compressed":22549812,"data_compressed_allocated":2147926016,"data_compressed_original":4295852032,"omap_allocated":1336479,"internal_metadata":3019348833},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_qu
eue_age_hist":{"histogram":[0,0,0,0,1,0,1,1],"upper_bound":256},"perf_stat":{"commit_latency_ms":55,"apply_latency_ms":55,"commit_latency_ns":55000000,"apply_latency_ns":55000000},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":-1287651673,"num_objects":-319,"num_object_clones":0,"num_object_copies":-638,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":-319,"num_whiteouts":0,"num_read":-2493,"num_read_kb":-235453,"num_write":-1786,"num_write_kb":-1258125,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":-5,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"10.369574"},"pg_stats":[{"pgid":"2.7","version":"237'4856","reported_seq":8583,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:33:23.072549+0000","last_change":"2026-03-24T17:33:06.881321+0000","last_active":"2026-03-24T17:33:23.072549+0000","last_peered":"2026-03-24T17:33:23.072549+0000","last_clean":"2026-03-24T17:33:23.072549+0000","last_became_active":"2026-03-24T16:53:01.825454+0000","last_became_peered":"2026-03-24T16:53:01.825454+0000","last_unstale":"2026-03-24T17:33:23.072549+0000","last_undegraded"
:"2026-03-24T17:33:23.072549+0000","last_fullsized":"2026-03-24T17:33:23.072549+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":4856,"log_dups_size":0,"ondisk_log_size":4856,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T21:48:43.097690+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00037862799999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":6676,"num_read_kb":23533,"num_write":4384,"num_write_kb":1011585,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.6","version":"237'5774","repor
ted_seq":10094,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:33:23.072700+0000","last_change":"2026-03-24T17:33:06.884172+0000","last_active":"2026-03-24T17:33:23.072700+0000","last_peered":"2026-03-24T17:33:23.072700+0000","last_clean":"2026-03-24T17:33:23.072700+0000","last_became_active":"2026-03-24T16:53:01.825605+0000","last_became_peered":"2026-03-24T16:53:01.825605+0000","last_unstale":"2026-03-24T17:33:23.072700+0000","last_undegraded":"2026-03-24T17:33:23.072700+0000","last_fullsized":"2026-03-24T17:33:23.072700+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":5774,"log_dups_size":0,"ondisk_log_size":5774,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-26T03:27:23.829221+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0031340059999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":7702,"num_read_kb":28675,"num_write":4426,"num_write_kb":1155866,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.5","version":"237'7847","reported_seq":11386,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:33:23.072577+0000","last_change":"2026-03-24T17:33:06.881391+0000","last_active":"2026-03-24T17:33:23.072577+0000","last_peered":"2026-03-24T17:33:23.072577+0000","last_clean":"2026-03-24T17:33:23.072577+0000","last_became_active":"2026-03-24T16:53:01.825817+0000","last_became_peered":"2026-03-24T16:53:01.825817+0000","last_unstale":"2026-03-24T17:33:23.072577+0000","last_undegraded":"2026-03-24T17:33:23.072577+0000","last_fullsized":"2026-03-24T17:33:23.072577+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_de
ep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":7847,"log_dups_size":0,"ondisk_log_size":7847,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T02:01:08.991385+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00030837699999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":25629,"num_read_kb":40418,"num_write":10625,"num_write_kb":1113062,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.4","version":"237'6992","reported_seq":11439,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:33:23.072616+0000","last_change":"2026-03-24T17:33:06.881834+0000","last_active":"2026-03-24T17:33:23.072616+0000","last_peered":"2026-03-24T17:33:23.072616+0000","last_clean":"2026-03-24T17:33:23.072616+0000","last_became_active":"202
6-03-24T16:53:01.823211+0000","last_became_peered":"2026-03-24T16:53:01.823211+0000","last_unstale":"2026-03-24T17:33:23.072616+0000","last_undegraded":"2026-03-24T17:33:23.072616+0000","last_fullsized":"2026-03-24T17:33:23.072616+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":6992,"log_dups_size":0,"ondisk_log_size":6992,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T18:24:38.787735+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00071319500000000002,"stat_sum":{"num_bytes":8,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":7479,"num_read_kb":28407,"num_write":5401,"num_write_kb":1218895,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no
_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.2","version":"237'5026","reported_seq":11075,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:33:23.523256+0000","last_change":"2026-03-24T17:33:06.880075+0000","last_active":"2026-03-24T17:33:23.523256+0000","last_peered":"2026-03-24T17:33:23.523256+0000","last_clean":"2026-03-24T17:33:23.523256+0000","last_became_active":"2026-03-24T16:53:01.823342+0000","last_became_peered":"2026-03-24T16:53:01.823342+0000","last_unstale":"2026-03-24T17:33:23.523256+0000","last_undegraded":"2026-03-24T17:33:23.523256+0000","last_fullsized":"2026-03-24T17:33:23.523256+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":5026,"log_dups_size":0,"ondisk_log_size":5026,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-26T01:11:42.583816+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00051557600000000001,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":9647,"num_read_kb":26613,"num_write":4477,"num_write_kb":1192636,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.1","version":"237'4740","reported_seq":9380,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:34:14.417545+0000","last_change":"2026-03-24T17:33:06.880366+0000","last_active":"2026-03-24T17:34:14.417545+0000","last_peered":"2026-03-24T17:34:14.417545+0000","last_clean":"2026-03-24T17:34:14.417545+0000","last_became_active":"2026-03-24T16:53:01.823089+0000","last_became_peered":"2026-03-24T16:53:01.823089+0000","last_unstale":"2026-03-24T17:34:14.417545+0000","last_undegraded":"2026-03-24T17:34:14.417545+0000","last_fullsized":"2026-03-24T17:34:14.417545+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_d
eep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":4740,"log_dups_size":0,"ondisk_log_size":4740,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T23:36:48.582129+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00026306099999999998,"stat_sum":{"num_bytes":0,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":10556,"num_read_kb":31959,"num_write":4465,"num_write_kb":1068977,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.0","version":"237'6037","reported_seq":9540,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:33:24.069466+0000","last_change":"2026-03-24T17:33:06.880259+0000","last_active":"2026-03-24T17:33:24.069466+0000","last_peered":"2026-03-24T17:33:24.069466+0000","last_clean":"2026-03-24T17:33:24.069466+0000","last_became_active":"2026
-03-24T16:53:01.822601+0000","last_became_peered":"2026-03-24T16:53:01.822601+0000","last_unstale":"2026-03-24T17:33:24.069466+0000","last_undegraded":"2026-03-24T17:33:24.069466+0000","last_fullsized":"2026-03-24T17:33:24.069466+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":6037,"log_dups_size":0,"ondisk_log_size":6037,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T00:11:47.602549+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00018387399999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":7404,"num_read_kb":25396,"num_write":5157,"num_write_kb":1049840,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_
missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.3","version":"237'6764","reported_seq":13178,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:33:23.072731+0000","last_change":"2026-03-24T17:33:06.881887+0000","last_active":"2026-03-24T17:33:23.072731+0000","last_peered":"2026-03-24T17:33:23.072731+0000","last_clean":"2026-03-24T17:33:23.072731+0000","last_became_active":"2026-03-24T16:53:01.825199+0000","last_became_peered":"2026-03-24T16:53:01.825199+0000","last_unstale":"2026-03-24T17:33:23.072731+0000","last_undegraded":"2026-03-24T17:33:23.072731+0000","last_fullsized":"2026-03-24T17:33:23.072731+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":6764,"log_dups_size":0,"ondisk_log_size":6764,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T20:45:16.308359+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00068371000000000003,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":9856,"num_read_kb":29410,"num_write":5473,"num_write_kb":1086668,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":1,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"1.0","version":"10'32","reported_seq":541,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:33:23.072650+0000","last_change":"2026-03-24T16:52:58.801131+0000","last_active":"2026-03-24T17:33:23.072650+0000","last_peered":"2026-03-24T17:33:23.072650+0000","last_clean":"2026-03-24T17:33:23.072650+0000","last_became_active":"2026-03-24T16:52:58.801011+0000","last_became_peered":"2026-03-24T16:52:58.801011+0000","last_unstale":"2026-03-24T17:33:23.072650+0000","last_undegraded":"2026-03-24T17:33:23.072650+0000","last_fullsized":"2026-03-24T17:33:23.072650+0000","mapping_epoch":9,"log_start":"0'0","ondisk_log_start":"0'0","created":9,"last_epoch_clean":10,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:52:57.792644+0000","last_deep_scrub":"0'0","last_deep_scr
ub_stamp":"2026-03-24T16:52:57.792644+0000","last_clean_scrub_stamp":"2026-03-24T16:52:57.792644+0000","objects_scrubbed":0,"log_size":32,"log_dups_size":0,"ondisk_log_size":32,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T17:11:40.022667+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}],"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":27,"num_objects":7,"num_object_clones":0,"num_object_copies":14,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":7,"num_whiteouts":0,"num_read":84949,"num_read_kb":234411,"num_write":44408,"num_write_kb":8897529,"num_scrub_errors":0,"n
um_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":16384,"data_stored":54,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1304181,"internal_metadata":0},"log_size":48036,"ondisk_log_size":48036,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":483328,"data_stored":918560,"data_compressed":5428,"data_compressed_allocated":442368,"data_compressed_original":884736,"omap_allocated":0,"internal_metad
ata":0},"log_size":32,"ondisk_log_size":32,"up":2,"acting":2,"num_store_stats":2}],"osd_stats":[{"osd":2,"up_from":8,"seq":34359738870,"num_pgs":7,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":1382936,"kb_used_data":507912,"kb_used_omap":369,"kb_used_meta":874638,"kb_avail":92988904,"statfs":{"total":96636764160,"available":95220637696,"internally_reserved":0,"allocated":520101888,"data_stored":1036947321,"data_compressed":5437952,"data_compressed_allocated":517996544,"data_compressed_original":1035993088,"omap_allocated":378208,"internal_metadata":895629984},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":46,"apply_latency_ms":46,"commit_latency_ns":46000000,"apply_latency_ns":46000000},"alerts":[]},{"osd":1,"up_from":8,"seq":34359738869,"num_pgs":13,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":1564436,"kb_used_data":545028,"kb_used_omap":548,"kb_used_meta":1018843,"kb_avail":92807404,"statfs":{"total":96636764160,"available":95034781696,"internally_reserved":0,"allocated":558108672,"data_stored":1112904217,"data_compressed":5836954,"data_compressed_allocated":555966464,"data_compressed_original":1111932928,"omap_allocated":561750,"internal_metadata":1043295658},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[0,0,0,0,1,0,1,1],"upper_bound":256},"perf_stat":{"commit_latency_ms":9,"apply_latency_ms":9,"commit_latency_ns":9000000,"apply_latency_ns":9000000},"alerts":[]},{"osd":0,"up_from":8,"seq":34359738872,"num_pgs":14,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":2106396,"kb_used_data":1050892,"kb_used_omap":387,"kb_used_meta":1055100,"kb_avail":92265444,"statfs":{"total":96636764160,"available":94479814656,"internally_reserved":0,"allocated":107611340
8,"data_stored":2148897419,"data_compressed":11274906,"data_compressed_allocated":1073963008,"data_compressed_original":2147926016,"omap_allocated":396521,"internal_metadata":1080423191},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":2714,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":2714,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":395193,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":541804,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":367184,"internal_metadata":0}]}} 2026-03-24T17:34:34.716 INFO:tasks.ceph.ceph_manager.ceph:clean! 
2026-03-24T17:34:34.716 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json 2026-03-24T17:34:34.868 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T17:34:34.868 INFO:teuthology.orchestra.run.vm01.stderr:dumped all 2026-03-24T17:34:34.881 INFO:teuthology.orchestra.run.vm01.stdout:{"pg_ready":true,"pg_map":{"version":1491,"stamp":"2026-03-24T17:34:33.486096+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459307,"num_objects":9,"num_object_clones":0,"num_object_copies":18,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":9,"num_whiteouts":0,"num_read":84995,"num_read_kb":234448,"num_write":44465,"num_write_kb":8898113,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":48068,"ondisk_log_size":48068,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":34,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":5053768,"kb_used_data":2103832,"kb_used_omap":1305,"kb_used_meta":2948582,"kb_avail":278061752,"statfs":{"total":2899
10292480,"available":284735234048,"internally_reserved":0,"allocated":2154323968,"data_stored":4298748957,"data_compressed":22549812,"data_compressed_allocated":2147926016,"data_compressed_original":4295852032,"omap_allocated":1336479,"internal_metadata":3019348833},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[0,0,0,0,1,0,1,1],"upper_bound":256},"perf_stat":{"commit_latency_ms":55,"apply_latency_ms":55,"commit_latency_ns":55000000,"apply_latency_ns":55000000},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":-1287651673,"num_objects":-319,"num_object_clones":0,"num_object_copies":-638,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":-319,"num_whiteouts":0,"num_read":-2493,"num_read_kb":-235453,"num_write":-1786,"num_write_kb":-1258125,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":-5,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"10.369574"},"pg_stats":[{"pgid":"2.7","version":"237'4856","reported_seq":8583,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:33:23.072549+0000","last_chan
ge":"2026-03-24T17:33:06.881321+0000","last_active":"2026-03-24T17:33:23.072549+0000","last_peered":"2026-03-24T17:33:23.072549+0000","last_clean":"2026-03-24T17:33:23.072549+0000","last_became_active":"2026-03-24T16:53:01.825454+0000","last_became_peered":"2026-03-24T16:53:01.825454+0000","last_unstale":"2026-03-24T17:33:23.072549+0000","last_undegraded":"2026-03-24T17:33:23.072549+0000","last_fullsized":"2026-03-24T17:33:23.072549+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":4856,"log_dups_size":0,"ondisk_log_size":4856,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T21:48:43.097690+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00037862799999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":6676,"num_read_kb":23533,"num_write":4384,"num_write_kb":1011585,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.6","version":"237'5774","reported_seq":10094,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:33:23.072700+0000","last_change":"2026-03-24T17:33:06.884172+0000","last_active":"2026-03-24T17:33:23.072700+0000","last_peered":"2026-03-24T17:33:23.072700+0000","last_clean":"2026-03-24T17:33:23.072700+0000","last_became_active":"2026-03-24T16:53:01.825605+0000","last_became_peered":"2026-03-24T16:53:01.825605+0000","last_unstale":"2026-03-24T17:33:23.072700+0000","last_undegraded":"2026-03-24T17:33:23.072700+0000","last_fullsized":"2026-03-24T17:33:23.072700+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_d
eep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":5774,"log_dups_size":0,"ondisk_log_size":5774,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T03:27:23.829221+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0031340059999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":7702,"num_read_kb":28675,"num_write":4426,"num_write_kb":1155866,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.5","version":"237'7847","reported_seq":11386,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:33:23.072577+0000","last_change":"2026-03-24T17:33:06.881391+0000","last_active":"2026-03-24T17:33:23.072577+0000","last_peered":"2026-03-24T17:33:23.072577+0000","last_clean":"2026-03-24T17:33:23.072577+0000","last_became_active":"2026-
03-24T16:53:01.825817+0000","last_became_peered":"2026-03-24T16:53:01.825817+0000","last_unstale":"2026-03-24T17:33:23.072577+0000","last_undegraded":"2026-03-24T17:33:23.072577+0000","last_fullsized":"2026-03-24T17:33:23.072577+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":7847,"log_dups_size":0,"ondisk_log_size":7847,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T02:01:08.991385+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00030837699999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":25629,"num_read_kb":40418,"num_write":10625,"num_write_kb":1113062,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no
_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.4","version":"237'6992","reported_seq":11439,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:33:23.072616+0000","last_change":"2026-03-24T17:33:06.881834+0000","last_active":"2026-03-24T17:33:23.072616+0000","last_peered":"2026-03-24T17:33:23.072616+0000","last_clean":"2026-03-24T17:33:23.072616+0000","last_became_active":"2026-03-24T16:53:01.823211+0000","last_became_peered":"2026-03-24T16:53:01.823211+0000","last_unstale":"2026-03-24T17:33:23.072616+0000","last_undegraded":"2026-03-24T17:33:23.072616+0000","last_fullsized":"2026-03-24T17:33:23.072616+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":6992,"log_dups_size":0,"ondisk_log_size":6992,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T18:24:38.787735+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00071319500000000002,"stat_sum":{"num_bytes":8,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":7479,"num_read_kb":28407,"num_write":5401,"num_write_kb":1218895,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.2","version":"237'5026","reported_seq":11075,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:33:23.523256+0000","last_change":"2026-03-24T17:33:06.880075+0000","last_active":"2026-03-24T17:33:23.523256+0000","last_peered":"2026-03-24T17:33:23.523256+0000","last_clean":"2026-03-24T17:33:23.523256+0000","last_became_active":"2026-03-24T16:53:01.823342+0000","last_became_peered":"2026-03-24T16:53:01.823342+0000","last_unstale":"2026-03-24T17:33:23.523256+0000","last_undegraded":"2026-03-24T17:33:23.523256+0000","last_fullsized":"2026-03-24T17:33:23.523256+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_d
eep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":5026,"log_dups_size":0,"ondisk_log_size":5026,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T01:11:42.583816+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00051557600000000001,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":9647,"num_read_kb":26613,"num_write":4477,"num_write_kb":1192636,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.1","version":"237'4740","reported_seq":9380,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:34:14.417545+0000","last_change":"2026-03-24T17:33:06.880366+0000","last_active":"2026-03-24T17:34:14.417545+0000","last_peered":"2026-03-24T17:34:14.417545+0000","last_clean":"2026-03-24T17:34:14.417545+0000","last_became_active":"2026
-03-24T16:53:01.823089+0000","last_became_peered":"2026-03-24T16:53:01.823089+0000","last_unstale":"2026-03-24T17:34:14.417545+0000","last_undegraded":"2026-03-24T17:34:14.417545+0000","last_fullsized":"2026-03-24T17:34:14.417545+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":4740,"log_dups_size":0,"ondisk_log_size":4740,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T23:36:48.582129+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00026306099999999998,"stat_sum":{"num_bytes":0,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":10556,"num_read_kb":31959,"num_write":4465,"num_write_kb":1068977,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no
_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.0","version":"237'6037","reported_seq":9540,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:33:24.069466+0000","last_change":"2026-03-24T17:33:06.880259+0000","last_active":"2026-03-24T17:33:24.069466+0000","last_peered":"2026-03-24T17:33:24.069466+0000","last_clean":"2026-03-24T17:33:24.069466+0000","last_became_active":"2026-03-24T16:53:01.822601+0000","last_became_peered":"2026-03-24T16:53:01.822601+0000","last_unstale":"2026-03-24T17:33:24.069466+0000","last_undegraded":"2026-03-24T17:33:24.069466+0000","last_fullsized":"2026-03-24T17:33:24.069466+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":6037,"log_dups_size":0,"ondisk_log_size":6037,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-26T00:11:47.602549+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00018387399999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":7404,"num_read_kb":25396,"num_write":5157,"num_write_kb":1049840,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.3","version":"237'6764","reported_seq":13178,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:33:23.072731+0000","last_change":"2026-03-24T17:33:06.881887+0000","last_active":"2026-03-24T17:33:23.072731+0000","last_peered":"2026-03-24T17:33:23.072731+0000","last_clean":"2026-03-24T17:33:23.072731+0000","last_became_active":"2026-03-24T16:53:01.825199+0000","last_became_peered":"2026-03-24T16:53:01.825199+0000","last_unstale":"2026-03-24T17:33:23.072731+0000","last_undegraded":"2026-03-24T17:33:23.072731+0000","last_fullsized":"2026-03-24T17:33:23.072731+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_d
eep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":6764,"log_dups_size":0,"ondisk_log_size":6764,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T20:45:16.308359+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00068371000000000003,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":9856,"num_read_kb":29410,"num_write":5473,"num_write_kb":1086668,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":1,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"1.0","version":"10'32","reported_seq":541,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:33:23.072650+0000","last_change":"2026-03-24T16:52:58.801131+0000","last_active":"2026-03-24T17:33:23.072650+0000","last_peered":"2026-03-24T17:33:23.072650+0000","last_clean":"2026-03-24T17:33:23.072650+0000","last_became_active":"2026-03-2
4T16:52:58.801011+0000","last_became_peered":"2026-03-24T16:52:58.801011+0000","last_unstale":"2026-03-24T17:33:23.072650+0000","last_undegraded":"2026-03-24T17:33:23.072650+0000","last_fullsized":"2026-03-24T17:33:23.072650+0000","mapping_epoch":9,"log_start":"0'0","ondisk_log_start":"0'0","created":9,"last_epoch_clean":10,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:52:57.792644+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:52:57.792644+0000","last_clean_scrub_stamp":"2026-03-24T16:52:57.792644+0000","objects_scrubbed":0,"log_size":32,"log_dups_size":0,"ondisk_log_size":32,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T17:11:40.022667+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[
],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}],"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":27,"num_objects":7,"num_object_clones":0,"num_object_copies":14,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":7,"num_whiteouts":0,"num_read":84949,"num_read_kb":234411,"num_write":44408,"num_write_kb":8897529,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":16384,"data_stored":54,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1304181,"internal_metadata":0},"log_size":48036,"ondisk_log_size":48036,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mo
de_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":483328,"data_stored":918560,"data_compressed":5428,"data_compressed_allocated":442368,"data_compressed_original":884736,"omap_allocated":0,"internal_metadata":0},"log_size":32,"ondisk_log_size":32,"up":2,"acting":2,"num_store_stats":2}],"osd_stats":[{"osd":2,"up_from":8,"seq":34359738870,"num_pgs":7,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":1382936,"kb_used_data":507912,"kb_used_omap":369,"kb_used_meta":874638,"kb_avail":92988904,"statfs":{"total":96636764160,"available":95220637696,"internally_reserved":0,"allocated":520101888,"data_stored":1036947321,"data_compressed":5437952,"data_compressed_allocated":517996544,"data_compressed_original":1035993088,"omap_allocated":378208,"internal_metadata":895629984},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":46,"apply_latency_ms":46,"commit_latency_ns":46000000,"apply_latency_ns":46000000},"alerts":[]},{"osd":1,"up_from":8,"seq":34359738869,"num_pgs":13,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":1564436,"kb_used_data":545028,"kb_used_omap":548,"kb_used_meta":1018843,"kb_avail":92807404,"statfs":{"total":96636764160,"available":95034781696,"internally_reserved":0,"allocated":558108672,"data_stored":1112904217,"data_compressed":5836954,"data_compressed_allocated":555966464,"data_compressed_original":1111932928,"omap_allocated":561750,"internal_metadata":1043295658},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[0,0,0,0,1,0,1,1],"
upper_bound":256},"perf_stat":{"commit_latency_ms":9,"apply_latency_ms":9,"commit_latency_ns":9000000,"apply_latency_ns":9000000},"alerts":[]},{"osd":0,"up_from":8,"seq":34359738872,"num_pgs":14,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":2106396,"kb_used_data":1050892,"kb_used_omap":387,"kb_used_meta":1055100,"kb_avail":92265444,"statfs":{"total":96636764160,"available":94479814656,"internally_reserved":0,"allocated":1076113408,"data_stored":2148897419,"data_compressed":11274906,"data_compressed_allocated":1073963008,"data_compressed_original":2147926016,"omap_allocated":396521,"internal_metadata":1080423191},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":2714,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":2714,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":395193,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":541804,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compres
sed_allocated":0,"data_compressed_original":0,"omap_allocated":367184,"internal_metadata":0}]}} 2026-03-24T17:34:34.882 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd dump --format=json 2026-03-24T17:34:35.033 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T17:34:35.033 INFO:teuthology.orchestra.run.vm01.stdout:{"epoch":246,"fsid":"dc403d64-7ddd-4e06-90c8-8f9c41489fa2","created":"2026-03-24T16:52:51.402550+0000","modified":"2026-03-24T17:34:33.481749+0000","last_up_change":"2026-03-24T16:52:56.653061+0000","last_in_change":"2026-03-24T16:52:52.145302+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":4,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":17,"max_osd":3,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"tentacle","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-24T16:52:56.800817+0000","flags":1,"flags_names":"hashpspool","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"11","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota
_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_type":"Fair distribution","score_acting":2.9900000095367432,"score_stable":2.9900000095367432,"optimal_score":0.67000001668930054,"raw_score_acting":2,"raw_score_stable":2,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}},{"pool":2,"pool_name":"rbd","create_time":"2026-03-24T16:52:59.857305+0000","flags":8193,"flags_names":"hashpspool,selfmanaged_snaps","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":8,"pg_placement_num":8,"pg_placement_num_target":8,"pg_num_target":8,"pg_num_pending":8,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"237","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":50,"snap_epoch":237,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier"
:-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{},"application_metadata":{"rbd":{}},"read_balance":{"score_type":"Fair distribution","score_acting":1.8799999952316284,"score_stable":1.8799999952316284,"optimal_score":1,"raw_score_acting":1.8799999952316284,"raw_score_stable":1.8799999952316284,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"60e193c7-684f-428f-8239-e42abf940efd","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":238,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6816","nonce":1022245844},{"type":"v1","addr":"192.168.123.101:6817","nonce":1022245844}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6818","nonce":1022245844},{"type":"v1","addr":"192.168.123.101:6819","nonce":1022245844}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6822","nonce":1022245844},{"type":"v1","addr":"192.168.123.101:6823","nonce":1022245844}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6820","nonce":1022245844},{"type":"v1","addr":"192.168.123.101:6821","nonce":1022245844}]},"public_addr":"192.168.123.101:6817/1022245844","cluster_addr":"192.168.123.101:6819/1022245844","heartbeat_back_addr":"192.168.123.101:6823/1022245844","heartbeat_front_addr":"192.168.123.101:6821/1022245844","state":["exi
sts","up"]},{"osd":1,"uuid":"3fdd8338-b04d-4ff4-a2f5-1de82f2b325c","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":238,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6800","nonce":4203820349},{"type":"v1","addr":"192.168.123.101:6801","nonce":4203820349}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6802","nonce":4203820349},{"type":"v1","addr":"192.168.123.101:6803","nonce":4203820349}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6806","nonce":4203820349},{"type":"v1","addr":"192.168.123.101:6807","nonce":4203820349}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6804","nonce":4203820349},{"type":"v1","addr":"192.168.123.101:6805","nonce":4203820349}]},"public_addr":"192.168.123.101:6801/4203820349","cluster_addr":"192.168.123.101:6803/4203820349","heartbeat_back_addr":"192.168.123.101:6807/4203820349","heartbeat_front_addr":"192.168.123.101:6805/4203820349","state":["exists","up"]},{"osd":2,"uuid":"10385bfd-c8a3-402e-825a-5468ed42a5f9","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":229,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6808","nonce":831386055},{"type":"v1","addr":"192.168.123.101:6809","nonce":831386055}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6810","nonce":831386055},{"type":"v1","addr":"192.168.123.101:6811","nonce":831386055}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6814","nonce":831386055},{"type":"v1","addr":"192.168.123.101:6815","nonce":831386055}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.101:6812","nonce":831386055},{"type":"v1","addr":"192.168.123.101:6813","nonce":831386055}]},"public_addr":"192.168.123.101:6809/831386055","cluster_addr":"192.168.123.101:6811/831386055","heartbe
at_back_addr":"192.168.123.101:6815/831386055","heartbeat_front_addr":"192.168.123.101:6813/831386055","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"2026-03-24T16:52:54.914038+0000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"2026-03-24T16:52:54.763667+0000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"2026-03-24T16:52:54.946337+0000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{"192.168.123.101:0/3800234888":"2026-03-24T18:31:42.592107+0000","192.168.123.101:0/1914320998":"2026-03-24T18:24:16.116944+0000","192.168.123.101:0/591039041":"2026-03-24T18:33:18.650884+0000","192.168.123.101:0/1643403251":"2026-03-24T18:19:31.357475+0000","192.168.123.101:0/3777821978":"2026-03-24T18:19:30.381266+0000","192.168.123.101:0/2946920246":"2026-03-24T18:19:29.522138+0000","192.168.123.101:0/1080676727":"2026-03-24T18:19:28.333133+0000","192.168.123.101:0/2911768874":"2026-03-24T18:29:10.820036+0000","192.168.123.101:0/892743974":"2026-03-24T18:18:45.335249+0000","192.168.123.101:0/2117733755":"2026-03-24T18:18:44.689891+0000"},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"isa","technique":"reed_sol_van"}},"removed_snaps_queue":[{"pool":3,"snaps":[{"begin":3,"length":1}]},{"pool":4,"snaps":[{"begin":2,"length":2}]},{"pool":6,"snaps":[{"begin":2,"length":2}]},{"pool":10,"snaps":[{"begin":3,"length":1}]},{"pool":14,"snaps":[{"begin":2,"length":1}]},{"pool":15,"snaps":[{"begin":2,"length":1}]}],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_
flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-24T17:34:36.046 INFO:tasks.ceph:Scrubbing osd.0 2026-03-24T17:34:36.046 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph tell osd.0 config set osd_debug_deep_scrub_sleep 0 2026-03-24T17:34:36.124 INFO:teuthology.orchestra.run.vm01.stdout:{ 2026-03-24T17:34:36.124 INFO:teuthology.orchestra.run.vm01.stdout: "success": "osd_debug_deep_scrub_sleep = '' (not observed, change may require restart) " 2026-03-24T17:34:36.124 INFO:teuthology.orchestra.run.vm01.stdout:} 2026-03-24T17:34:36.134 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd deep-scrub 0 2026-03-24T17:34:36.287 INFO:teuthology.orchestra.run.vm01.stderr:instructed osd(s) 0 to deep-scrub 2026-03-24T17:34:36.300 INFO:tasks.ceph:Scrubbing osd.1 2026-03-24T17:34:36.300 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph tell osd.1 config set osd_debug_deep_scrub_sleep 0 2026-03-24T17:34:36.375 INFO:teuthology.orchestra.run.vm01.stdout:{ 2026-03-24T17:34:36.375 INFO:teuthology.orchestra.run.vm01.stdout: "success": "osd_debug_deep_scrub_sleep = '' (not observed, change may require restart) " 2026-03-24T17:34:36.375 INFO:teuthology.orchestra.run.vm01.stdout:} 2026-03-24T17:34:36.385 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd deep-scrub 1 2026-03-24T17:34:36.543 INFO:teuthology.orchestra.run.vm01.stderr:instructed osd(s) 1 to deep-scrub 2026-03-24T17:34:36.556 INFO:tasks.ceph:Scrubbing osd.2 2026-03-24T17:34:36.556 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits 
ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph tell osd.2 config set osd_debug_deep_scrub_sleep 0 2026-03-24T17:34:36.631 INFO:teuthology.orchestra.run.vm01.stdout:{ 2026-03-24T17:34:36.631 INFO:teuthology.orchestra.run.vm01.stdout: "success": "osd_debug_deep_scrub_sleep = '' (not observed, change may require restart) " 2026-03-24T17:34:36.631 INFO:teuthology.orchestra.run.vm01.stdout:} 2026-03-24T17:34:36.642 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd deep-scrub 2 2026-03-24T17:34:36.796 INFO:teuthology.orchestra.run.vm01.stderr:instructed osd(s) 2 to deep-scrub 2026-03-24T17:34:36.809 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json 2026-03-24T17:34:36.966 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T17:34:36.966 INFO:teuthology.orchestra.run.vm01.stderr:dumped all 2026-03-24T17:34:36.979 
INFO:teuthology.orchestra.run.vm01.stdout:{"pg_ready":true,"pg_map":{"version":1492,"stamp":"2026-03-24T17:34:35.118073+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459307,"num_objects":9,"num_object_clones":0,"num_object_copies":18,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":9,"num_whiteouts":0,"num_read":84995,"num_read_kb":234448,"num_write":44465,"num_write_kb":8898113,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":48068,"ondisk_log_size":48068,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":34,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":4760916,"kb_used_data":1810972,"kb_used_omap":1305,"kb_used_meta":2948582,"kb_avail":278354604,"statfs":{"total":289910292480,"available":285035114496,"internally_reserved":0,"allocated":1854435328,"data_stored":3698967465,"data_compressed":19401524,"data_compressed_allocated":1848033280,"data_compressed_original":3696066560,"omap_allocated":1336606,"internal_metadata":3019348706},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_qu
eue_age_hist":{"histogram":[0,0,0,0,1,0,1,1],"upper_bound":256},"perf_stat":{"commit_latency_ms":59,"apply_latency_ms":59,"commit_latency_ns":59000000,"apply_latency_ns":59000000},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":-1287651673,"num_objects":-319,"num_object_clones":0,"num_object_copies":-638,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":-319,"num_whiteouts":0,"num_read":-2493,"num_read_kb":-235453,"num_write":-1786,"num_write_kb":-1258125,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":-5,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"10.001300"},"pg_stats":[{"pgid":"2.7","version":"237'4856","reported_seq":8583,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:33:23.072549+0000","last_change":"2026-03-24T17:33:06.881321+0000","last_active":"2026-03-24T17:33:23.072549+0000","last_peered":"2026-03-24T17:33:23.072549+0000","last_clean":"2026-03-24T17:33:23.072549+0000","last_became_active":"2026-03-24T16:53:01.825454+0000","last_became_peered":"2026-03-24T16:53:01.825454+0000","last_unstale":"2026-03-24T17:33:23.072549+0000","last_undegraded"
:"2026-03-24T17:33:23.072549+0000","last_fullsized":"2026-03-24T17:33:23.072549+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":4856,"log_dups_size":0,"ondisk_log_size":4856,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T21:48:43.097690+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00037862799999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":6676,"num_read_kb":23533,"num_write":4384,"num_write_kb":1011585,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.6","version":"237'5774","repor
ted_seq":10094,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:33:23.072700+0000","last_change":"2026-03-24T17:33:06.884172+0000","last_active":"2026-03-24T17:33:23.072700+0000","last_peered":"2026-03-24T17:33:23.072700+0000","last_clean":"2026-03-24T17:33:23.072700+0000","last_became_active":"2026-03-24T16:53:01.825605+0000","last_became_peered":"2026-03-24T16:53:01.825605+0000","last_unstale":"2026-03-24T17:33:23.072700+0000","last_undegraded":"2026-03-24T17:33:23.072700+0000","last_fullsized":"2026-03-24T17:33:23.072700+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":5774,"log_dups_size":0,"ondisk_log_size":5774,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-26T03:27:23.829221+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0031340059999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":7702,"num_read_kb":28675,"num_write":4426,"num_write_kb":1155866,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.5","version":"237'7847","reported_seq":11386,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:33:23.072577+0000","last_change":"2026-03-24T17:33:06.881391+0000","last_active":"2026-03-24T17:33:23.072577+0000","last_peered":"2026-03-24T17:33:23.072577+0000","last_clean":"2026-03-24T17:33:23.072577+0000","last_became_active":"2026-03-24T16:53:01.825817+0000","last_became_peered":"2026-03-24T16:53:01.825817+0000","last_unstale":"2026-03-24T17:33:23.072577+0000","last_undegraded":"2026-03-24T17:33:23.072577+0000","last_fullsized":"2026-03-24T17:33:23.072577+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_de
ep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":7847,"log_dups_size":0,"ondisk_log_size":7847,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T02:01:08.991385+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00030837699999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":25629,"num_read_kb":40418,"num_write":10625,"num_write_kb":1113062,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.4","version":"237'6992","reported_seq":11439,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:33:23.072616+0000","last_change":"2026-03-24T17:33:06.881834+0000","last_active":"2026-03-24T17:33:23.072616+0000","last_peered":"2026-03-24T17:33:23.072616+0000","last_clean":"2026-03-24T17:33:23.072616+0000","last_became_active":"202
6-03-24T16:53:01.823211+0000","last_became_peered":"2026-03-24T16:53:01.823211+0000","last_unstale":"2026-03-24T17:33:23.072616+0000","last_undegraded":"2026-03-24T17:33:23.072616+0000","last_fullsized":"2026-03-24T17:33:23.072616+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":6992,"log_dups_size":0,"ondisk_log_size":6992,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T18:24:38.787735+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00071319500000000002,"stat_sum":{"num_bytes":8,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":7479,"num_read_kb":28407,"num_write":5401,"num_write_kb":1218895,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no
_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.2","version":"237'5026","reported_seq":11075,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:33:23.523256+0000","last_change":"2026-03-24T17:33:06.880075+0000","last_active":"2026-03-24T17:33:23.523256+0000","last_peered":"2026-03-24T17:33:23.523256+0000","last_clean":"2026-03-24T17:33:23.523256+0000","last_became_active":"2026-03-24T16:53:01.823342+0000","last_became_peered":"2026-03-24T16:53:01.823342+0000","last_unstale":"2026-03-24T17:33:23.523256+0000","last_undegraded":"2026-03-24T17:33:23.523256+0000","last_fullsized":"2026-03-24T17:33:23.523256+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":5026,"log_dups_size":0,"ondisk_log_size":5026,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-26T01:11:42.583816+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00051557600000000001,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":9647,"num_read_kb":26613,"num_write":4477,"num_write_kb":1192636,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.1","version":"237'4740","reported_seq":9382,"reported_epoch":246,"state":"active+clean","last_fresh":"2026-03-24T17:34:33.489498+0000","last_change":"2026-03-24T17:33:06.880366+0000","last_active":"2026-03-24T17:34:33.489498+0000","last_peered":"2026-03-24T17:34:33.489498+0000","last_clean":"2026-03-24T17:34:33.489498+0000","last_became_active":"2026-03-24T16:53:01.823089+0000","last_became_peered":"2026-03-24T16:53:01.823089+0000","last_unstale":"2026-03-24T17:34:33.489498+0000","last_undegraded":"2026-03-24T17:34:33.489498+0000","last_fullsized":"2026-03-24T17:34:33.489498+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_d
eep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":4740,"log_dups_size":0,"ondisk_log_size":4740,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T23:36:48.582129+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00026306099999999998,"stat_sum":{"num_bytes":0,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":10556,"num_read_kb":31959,"num_write":4465,"num_write_kb":1068977,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.0","version":"237'6037","reported_seq":9542,"reported_epoch":246,"state":"active+clean","last_fresh":"2026-03-24T17:34:33.489450+0000","last_change":"2026-03-24T17:33:06.880259+0000","last_active":"2026-03-24T17:34:33.489450+0000","last_peered":"2026-03-24T17:34:33.489450+0000","last_clean":"2026-03-24T17:34:33.489450+0000","last_became_active":"2026
-03-24T16:53:01.822601+0000","last_became_peered":"2026-03-24T16:53:01.822601+0000","last_unstale":"2026-03-24T17:34:33.489450+0000","last_undegraded":"2026-03-24T17:34:33.489450+0000","last_fullsized":"2026-03-24T17:34:33.489450+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":6037,"log_dups_size":0,"ondisk_log_size":6037,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T00:11:47.602549+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00018387399999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":7404,"num_read_kb":25396,"num_write":5157,"num_write_kb":1049840,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_
missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.3","version":"237'6764","reported_seq":13178,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:33:23.072731+0000","last_change":"2026-03-24T17:33:06.881887+0000","last_active":"2026-03-24T17:33:23.072731+0000","last_peered":"2026-03-24T17:33:23.072731+0000","last_clean":"2026-03-24T17:33:23.072731+0000","last_became_active":"2026-03-24T16:53:01.825199+0000","last_became_peered":"2026-03-24T16:53:01.825199+0000","last_unstale":"2026-03-24T17:33:23.072731+0000","last_undegraded":"2026-03-24T17:33:23.072731+0000","last_fullsized":"2026-03-24T17:33:23.072731+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T16:53:00.810392+0000","last_clean_scrub_stamp":"2026-03-24T16:53:00.810392+0000","objects_scrubbed":0,"log_size":6764,"log_dups_size":0,"ondisk_log_size":6764,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T20:45:16.308359+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00068371000000000003,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":9856,"num_read_kb":29410,"num_write":5473,"num_write_kb":1086668,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":1,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"1.0","version":"10'32","reported_seq":541,"reported_epoch":245,"state":"active+clean","last_fresh":"2026-03-24T17:33:23.072650+0000","last_change":"2026-03-24T16:52:58.801131+0000","last_active":"2026-03-24T17:33:23.072650+0000","last_peered":"2026-03-24T17:33:23.072650+0000","last_clean":"2026-03-24T17:33:23.072650+0000","last_became_active":"2026-03-24T16:52:58.801011+0000","last_became_peered":"2026-03-24T16:52:58.801011+0000","last_unstale":"2026-03-24T17:33:23.072650+0000","last_undegraded":"2026-03-24T17:33:23.072650+0000","last_fullsized":"2026-03-24T17:33:23.072650+0000","mapping_epoch":9,"log_start":"0'0","ondisk_log_start":"0'0","created":9,"last_epoch_clean":10,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T16:52:57.792644+0000","last_deep_scrub":"0'0","last_deep_scr
ub_stamp":"2026-03-24T16:52:57.792644+0000","last_clean_scrub_stamp":"2026-03-24T16:52:57.792644+0000","objects_scrubbed":0,"log_size":32,"log_dups_size":0,"ondisk_log_size":32,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T17:11:40.022667+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}],"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":27,"num_objects":7,"num_object_clones":0,"num_object_copies":14,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":7,"num_whiteouts":0,"num_read":84949,"num_read_kb":234411,"num_write":44408,"num_write_kb":8897529,"num_scrub_errors":0,"n
um_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":16384,"data_stored":54,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1304308,"internal_metadata":0},"log_size":48036,"ondisk_log_size":48036,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":483328,"data_stored":918560,"data_compressed":5428,"data_compressed_allocated":442368,"data_compressed_original":884736,"omap_allocated":0,"internal_metad
ata":0},"log_size":32,"ondisk_log_size":32,"up":2,"acting":2,"num_store_stats":2}],"osd_stats":[{"osd":2,"up_from":8,"seq":34359738871,"num_pgs":7,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":1090084,"kb_used_data":215052,"kb_used_omap":369,"kb_used_meta":874638,"kb_avail":93281756,"statfs":{"total":96636764160,"available":95520518144,"internally_reserved":0,"allocated":220213248,"data_stored":437165829,"data_compressed":2289664,"data_compressed_allocated":218103808,"data_compressed_original":436207616,"omap_allocated":378335,"internal_metadata":895629857},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":50,"apply_latency_ms":50,"commit_latency_ns":50000000,"apply_latency_ns":50000000},"alerts":[]},{"osd":1,"up_from":8,"seq":34359738869,"num_pgs":13,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":1564436,"kb_used_data":545028,"kb_used_omap":548,"kb_used_meta":1018843,"kb_avail":92807404,"statfs":{"total":96636764160,"available":95034781696,"internally_reserved":0,"allocated":558108672,"data_stored":1112904217,"data_compressed":5836954,"data_compressed_allocated":555966464,"data_compressed_original":1111932928,"omap_allocated":561750,"internal_metadata":1043295658},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[0,0,0,0,1,0,1,1],"upper_bound":256},"perf_stat":{"commit_latency_ms":9,"apply_latency_ms":9,"commit_latency_ns":9000000,"apply_latency_ns":9000000},"alerts":[]},{"osd":0,"up_from":8,"seq":34359738872,"num_pgs":14,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":2106396,"kb_used_data":1050892,"kb_used_omap":387,"kb_used_meta":1055100,"kb_avail":92265444,"statfs":{"total":96636764160,"available":94479814656,"internally_reserved":0,"allocated":1076113408,
"data_stored":2148897419,"data_compressed":11274906,"data_compressed_allocated":1073963008,"data_compressed_original":2147926016,"omap_allocated":396521,"internal_metadata":1080423191},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":2714,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":2714,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":395193,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":541804,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":367311,"internal_metadata":0}]}} 2026-03-24T17:34:36.980 INFO:tasks.ceph:pgid 2.7 last_scrub_stamp 2026-03-24T16:53:00.810392+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=16, tm_min=53, tm_sec=0, tm_wday=1, tm_yday=83, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=17, tm_min=34, tm_sec=35, tm_wday=1, tm_yday=83, tm_isdst=0) 2026-03-24T17:34:36.980 
INFO:tasks.ceph:pgid 2.6 last_scrub_stamp 2026-03-24T16:53:00.810392+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=16, tm_min=53, tm_sec=0, tm_wday=1, tm_yday=83, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=17, tm_min=34, tm_sec=35, tm_wday=1, tm_yday=83, tm_isdst=0) 2026-03-24T17:34:36.980 INFO:tasks.ceph:pgid 2.5 last_scrub_stamp 2026-03-24T16:53:00.810392+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=16, tm_min=53, tm_sec=0, tm_wday=1, tm_yday=83, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=17, tm_min=34, tm_sec=35, tm_wday=1, tm_yday=83, tm_isdst=0) 2026-03-24T17:34:36.980 INFO:tasks.ceph:pgid 2.4 last_scrub_stamp 2026-03-24T16:53:00.810392+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=16, tm_min=53, tm_sec=0, tm_wday=1, tm_yday=83, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=17, tm_min=34, tm_sec=35, tm_wday=1, tm_yday=83, tm_isdst=0) 2026-03-24T17:34:36.981 INFO:tasks.ceph:pgid 2.2 last_scrub_stamp 2026-03-24T16:53:00.810392+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=16, tm_min=53, tm_sec=0, tm_wday=1, tm_yday=83, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=17, tm_min=34, tm_sec=35, tm_wday=1, tm_yday=83, tm_isdst=0) 2026-03-24T17:34:36.981 INFO:tasks.ceph:pgid 2.1 last_scrub_stamp 2026-03-24T16:53:00.810392+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=16, tm_min=53, tm_sec=0, tm_wday=1, tm_yday=83, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=17, tm_min=34, tm_sec=35, tm_wday=1, tm_yday=83, tm_isdst=0) 2026-03-24T17:34:36.981 INFO:tasks.ceph:pgid 2.0 last_scrub_stamp 2026-03-24T16:53:00.810392+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=16, tm_min=53, tm_sec=0, tm_wday=1, tm_yday=83, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=17, tm_min=34, 
tm_sec=35, tm_wday=1, tm_yday=83, tm_isdst=0) 2026-03-24T17:34:36.981 INFO:tasks.ceph:pgid 2.3 last_scrub_stamp 2026-03-24T16:53:00.810392+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=16, tm_min=53, tm_sec=0, tm_wday=1, tm_yday=83, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=17, tm_min=34, tm_sec=35, tm_wday=1, tm_yday=83, tm_isdst=0) 2026-03-24T17:34:36.981 INFO:tasks.ceph:pgid 1.0 last_scrub_stamp 2026-03-24T16:52:57.792644+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=16, tm_min=52, tm_sec=57, tm_wday=1, tm_yday=83, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=17, tm_min=34, tm_sec=35, tm_wday=1, tm_yday=83, tm_isdst=0) 2026-03-24T17:34:36.981 INFO:tasks.ceph:Still waiting for all pgs to be scrubbed. 2026-03-24T17:34:56.981 DEBUG:teuthology.orchestra.run.vm01:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json 2026-03-24T17:34:57.135 INFO:teuthology.orchestra.run.vm01.stdout: 2026-03-24T17:34:57.135 INFO:teuthology.orchestra.run.vm01.stderr:dumped all 2026-03-24T17:34:57.148 
INFO:teuthology.orchestra.run.vm01.stdout:{"pg_ready":true,"pg_map":{"version":1503,"stamp":"2026-03-24T17:34:57.120773+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459307,"num_objects":9,"num_object_clones":0,"num_object_copies":18,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":9,"num_whiteouts":0,"num_read":84995,"num_read_kb":234448,"num_write":44465,"num_write_kb":8898113,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":48077,"ondisk_log_size":48077,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":18,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":2956604,"kb_used_data":6668,"kb_used_omap":1310,"kb_used_meta":2948577,"kb_avail":280158916,"statfs":{"total":289910292480,"available":286882729984,"internally_reserved":0,"allocated":6828032,"data_stored":3793253,"data_compressed":5428,"data_compressed_allocated":442368,"data_compressed_original":884736,"omap_allocated":1341976,"internal_metadata":3019343336},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histo
gram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"12.001413"},"pg_stats":[{"pgid":"2.7","version":"237'4856","reported_seq":8593,"reported_epoch":246,"state":"active+clean","last_fresh":"2026-03-24T17:34:41.007834+0000","last_change":"2026-03-24T17:34:41.007834+0000","last_active":"2026-03-24T17:34:41.007834+0000","last_peered":"2026-03-24T17:34:41.007834+0000","last_clean":"2026-03-24T17:34:41.007834+0000","last_became_active":"2026-03-24T16:53:01.825454+0000","last_became_peered":"2026-03-24T16:53:01.825454+0000","last_unstale":"2026-03-24T17:34:41.007834+0000","last_undegraded":"2026-03-24T17:34:41.007834+0000","last_fullsized":"2026-03-24T17:34:41.007834+0000","mapping_
epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"237'4856","last_scrub_stamp":"2026-03-24T17:34:41.007774+0000","last_deep_scrub":"237'4856","last_deep_scrub_stamp":"2026-03-24T17:34:41.007774+0000","last_clean_scrub_stamp":"2026-03-24T17:34:41.007774+0000","objects_scrubbed":0,"log_size":4856,"log_dups_size":0,"ondisk_log_size":4856,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T01:42:48.001216+0000","scrub_duration":5,"objects_trimmed":0,"snaptrim_duration":0.00037862799999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":6676,"num_read_kb":23533,"num_write":4384,"num_write_kb":1011585,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.6","version":"237'5774","reported_seq":10104,"reported_epoch":246,"state":"active+clean","last_fresh":"2026-03-24T1
7:34:39.027721+0000","last_change":"2026-03-24T17:34:39.027721+0000","last_active":"2026-03-24T17:34:39.027721+0000","last_peered":"2026-03-24T17:34:39.027721+0000","last_clean":"2026-03-24T17:34:39.027721+0000","last_became_active":"2026-03-24T16:53:01.825605+0000","last_became_peered":"2026-03-24T16:53:01.825605+0000","last_unstale":"2026-03-24T17:34:39.027721+0000","last_undegraded":"2026-03-24T17:34:39.027721+0000","last_fullsized":"2026-03-24T17:34:39.027721+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"237'5774","last_scrub_stamp":"2026-03-24T17:34:39.027663+0000","last_deep_scrub":"237'5774","last_deep_scrub_stamp":"2026-03-24T17:34:39.027663+0000","last_clean_scrub_stamp":"2026-03-24T17:34:39.027663+0000","objects_scrubbed":0,"log_size":5774,"log_dups_size":0,"ondisk_log_size":5774,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T20:00:24.110287+0000","scrub_duration":9,"objects_trimmed":0,"snaptrim_duration":0.0031340059999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":7702,"num_read_kb":28675,"num_write":4426,"num_write_kb":1155866,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.5","version":"237'7847","reported_seq":11397,"reported_epoch":246,"state":"active+clean","last_fresh":"2026-03-24T17:34:40.038294+0000","last_change":"2026-03-24T17:34:40.038261+0000","last_active":"2026-03-24T17:34:40.038294+0000","last_peered":"2026-03-24T17:34:40.038294+0000","last_clean":"2026-03-24T17:34:40.038294+0000","last_became_active":"2026-03-24T16:53:01.825817+0000","last_became_peered":"2026-03-24T16:53:01.825817+0000","last_unstale":"2026-03-24T17:34:40.038294+0000","last_undegraded":"2026-03-24T17:34:40.038294+0000","last_fullsized":"2026-03-24T17:34:40.038294+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"237'7847","last_scrub_stamp":"2026-03-24T17:34:40.038181+0000","last_deep_scrub":"237'7847
","last_deep_scrub_stamp":"2026-03-24T17:34:40.038181+0000","last_clean_scrub_stamp":"2026-03-24T17:34:40.038181+0000","objects_scrubbed":0,"log_size":7847,"log_dups_size":0,"ondisk_log_size":7847,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T00:23:50.343895+0000","scrub_duration":9,"objects_trimmed":0,"snaptrim_duration":0.00030837699999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":25629,"num_read_kb":40418,"num_write":10625,"num_write_kb":1113062,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.4","version":"246'6994","reported_seq":11452,"reported_epoch":246,"state":"active+clean","last_fresh":"2026-03-24T17:34:41.989381+0000","last_change":"2026-03-24T17:34:41.989321+0000","last_active":"2026-03-24T17:34:41.989381+0000","last_peered":"2026-03-24T17:34:41.989381+0000","last_clean":"2026-03-24T17:34:41.989381+0000","last_became_ac
tive":"2026-03-24T16:53:01.823211+0000","last_became_peered":"2026-03-24T16:53:01.823211+0000","last_unstale":"2026-03-24T17:34:41.989381+0000","last_undegraded":"2026-03-24T17:34:41.989381+0000","last_fullsized":"2026-03-24T17:34:41.989381+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"246'6994","last_scrub_stamp":"2026-03-24T17:34:41.989264+0000","last_deep_scrub":"246'6994","last_deep_scrub_stamp":"2026-03-24T17:34:41.989264+0000","last_clean_scrub_stamp":"2026-03-24T17:34:41.989264+0000","objects_scrubbed":2,"log_size":6994,"log_dups_size":0,"ondisk_log_size":6994,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T21:10:02.274832+0000","scrub_duration":9,"objects_trimmed":0,"snaptrim_duration":0.00071319500000000002,"stat_sum":{"num_bytes":8,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":7479,"num_read_kb":28407,"num_write":5401,"num_write_kb":1218895,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"act
ing":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.2","version":"246'5028","reported_seq":11088,"reported_epoch":246,"state":"active+clean","last_fresh":"2026-03-24T17:34:36.475843+0000","last_change":"2026-03-24T17:34:36.475803+0000","last_active":"2026-03-24T17:34:36.475843+0000","last_peered":"2026-03-24T17:34:36.475843+0000","last_clean":"2026-03-24T17:34:36.475843+0000","last_became_active":"2026-03-24T16:53:01.823342+0000","last_became_peered":"2026-03-24T16:53:01.823342+0000","last_unstale":"2026-03-24T17:34:36.475843+0000","last_undegraded":"2026-03-24T17:34:36.475843+0000","last_fullsized":"2026-03-24T17:34:36.475843+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"246'5028","last_scrub_stamp":"2026-03-24T17:34:36.475749+0000","last_deep_scrub":"246'5028","last_deep_scrub_stamp":"2026-03-24T17:34:36.475749+0000","last_clean_scrub_stamp":"2026-03-24T17:34:36.475749+0000","objects_scrubbed":2,"log_size":5028,"log_dups_size":0,"ondisk_log_size":5028,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T23:25:31.964309+0000","scrub_duration":17,"objects_trimmed":0,"snaptrim_duration":0.00051557600000000001,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":9647,"num_read_kb":26613,"num_write":4477,"num_write_kb":1192636,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.1","version":"246'4742","reported_seq":9393,"reported_epoch":246,"state":"active+clean","last_fresh":"2026-03-24T17:34:37.959213+0000","last_change":"2026-03-24T17:34:37.959187+0000","last_active":"2026-03-24T17:34:37.959213+0000","last_peered":"2026-03-24T17:34:37.959213+0000","last_clean":"2026-03-24T17:34:37.959213+0000","last_became_active":"2026-03-24T16:53:01.823089+0000","last_became_peered":"2026-03-24T16:53:01.823089+0000","last_unstale":"2026-03-24T17:34:37.959213+0000","last_undegraded":"2026-03-24T17:34:37.959213+0000","last_fullsized":"2026-03-24T17:34:37.959213+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"246'4742","last_scrub_stamp":"2026-03-24T17:34:37.959148+0000","last_deep_scrub":"246'47
42","last_deep_scrub_stamp":"2026-03-24T17:34:37.959148+0000","last_clean_scrub_stamp":"2026-03-24T17:34:37.959148+0000","objects_scrubbed":2,"log_size":4742,"log_dups_size":0,"ondisk_log_size":4742,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T02:21:58.762436+0000","scrub_duration":9,"objects_trimmed":0,"snaptrim_duration":0.00026306099999999998,"stat_sum":{"num_bytes":0,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":10556,"num_read_kb":31959,"num_write":4465,"num_write_kb":1068977,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.0","version":"237'6037","reported_seq":9550,"reported_epoch":246,"state":"active+clean","last_fresh":"2026-03-24T17:34:36.912038+0000","last_change":"2026-03-24T17:34:36.912038+0000","last_active":"2026-03-24T17:34:36.912038+0000","last_peered":"2026-03-24T17:34:36.912038+0000","last_clean":"2026-03-24T17:34:36.912038+0000","last_became_ac
tive":"2026-03-24T16:53:01.822601+0000","last_became_peered":"2026-03-24T16:53:01.822601+0000","last_unstale":"2026-03-24T17:34:36.912038+0000","last_undegraded":"2026-03-24T17:34:36.912038+0000","last_fullsized":"2026-03-24T17:34:36.912038+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"237'6037","last_scrub_stamp":"2026-03-24T17:34:36.911990+0000","last_deep_scrub":"237'6037","last_deep_scrub_stamp":"2026-03-24T17:34:36.911990+0000","last_clean_scrub_stamp":"2026-03-24T17:34:36.911990+0000","objects_scrubbed":0,"log_size":6037,"log_dups_size":0,"ondisk_log_size":6037,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 2026-03-26T03:12:20.257303+0000","scrub_duration":5,"objects_trimmed":0,"snaptrim_duration":0.00018387399999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":7404,"num_read_kb":25396,"num_write":5157,"num_write_kb":1049840,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"act
ing":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.3","version":"246'6765","reported_seq":13190,"reported_epoch":246,"state":"active+clean","last_fresh":"2026-03-24T17:34:38.047980+0000","last_change":"2026-03-24T17:34:38.047952+0000","last_active":"2026-03-24T17:34:38.047980+0000","last_peered":"2026-03-24T17:34:38.047980+0000","last_clean":"2026-03-24T17:34:38.047980+0000","last_became_active":"2026-03-24T16:53:01.825199+0000","last_became_peered":"2026-03-24T16:53:01.825199+0000","last_unstale":"2026-03-24T17:34:38.047980+0000","last_undegraded":"2026-03-24T17:34:38.047980+0000","last_fullsized":"2026-03-24T17:34:38.047980+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"246'6765","last_scrub_stamp":"2026-03-24T17:34:38.047914+0000","last_deep_scrub":"246'6765","last_deep_scrub_stamp":"2026-03-24T17:34:38.047914+0000","last_clean_scrub_stamp":"2026-03-24T17:34:38.047914+0000","objects_scrubbed":1,"log_size":6765,"log_dups_size":0,"ondisk_log_size":6765,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T17:54:59.027211+0000","scrub_duration":9,"objects_trimmed":0,"snaptrim_duration":0.00068371000000000003,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":9856,"num_read_kb":29410,"num_write":5473,"num_write_kb":1086668,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":1,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"1.0","version":"246'34","reported_seq":553,"reported_epoch":246,"state":"active+clean","last_fresh":"2026-03-24T17:34:37.008173+0000","last_change":"2026-03-24T17:34:37.008173+0000","last_active":"2026-03-24T17:34:37.008173+0000","last_peered":"2026-03-24T17:34:37.008173+0000","last_clean":"2026-03-24T17:34:37.008173+0000","last_became_active":"2026-03-24T16:52:58.801011+0000","last_became_peered":"2026-03-24T16:52:58.801011+0000","last_unstale":"2026-03-24T17:34:37.008173+0000","last_undegraded":"2026-03-24T17:34:37.008173+0000","last_fullsized":"2026-03-24T17:34:37.008173+0000","mapping_epoch":9,"log_start":"0'0","ondisk_log_start":"0'0","created":9,"last_epoch_clean":10,"parent":"0.0","parent_split_bits":0,"last_scrub":"246'34","last_scrub_stamp":"2026-03-24T17:34:37.008136+0000","last_deep_scrub":"246'34","last_d
eep_scrub_stamp":"2026-03-24T17:34:37.008136+0000","last_clean_scrub_stamp":"2026-03-24T17:34:37.008136+0000","objects_scrubbed":2,"log_size":34,"log_dups_size":0,"ondisk_log_size":34,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T18:27:39.476351+0000","scrub_duration":5,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}],"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":27,"num_objects":7,"num_object_clones":0,"num_object_copies":14,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":7,"num_whiteouts":0,"num_read":84949,"num_read_kb":234411,"num_write":44408,"num_write_kb":8897529,"num_scrub_error
s":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":16384,"data_stored":54,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1309208,"internal_metadata":0},"log_size":48043,"ondisk_log_size":48043,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":483328,"data_stored":918560,"data_compressed":5428,"data_compressed_allocated":442368,"data_compressed_original":884736,"omap_allocated":0,"interna
l_metadata":0},"log_size":34,"ondisk_log_size":34,"up":2,"acting":2,"num_store_stats":2}],"osd_stats":[{"osd":2,"up_from":8,"seq":34359738875,"num_pgs":3,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":877084,"kb_used_data":2060,"kb_used_omap":369,"kb_used_meta":874638,"kb_avail":93494756,"statfs":{"total":96636764160,"available":95738630144,"internally_reserved":0,"allocated":2109440,"data_stored":958213,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":378335,"internal_metadata":895629857},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]},{"osd":1,"up_from":8,"seq":34359738873,"num_pgs":9,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":1021712,"kb_used_data":2304,"kb_used_omap":550,"kb_used_meta":1018841,"kb_avail":93350128,"statfs":{"total":96636764160,"available":95590531072,"internally_reserved":0,"allocated":2359296,"data_stored":1417520,"data_compressed":2714,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":564208,"internal_metadata":1043293200},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]},{"osd":0,"up_from":8,"seq":34359738876,"num_pgs":6,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":1057808,"kb_used_data":2304,"kb_used_omap":390,"kb_used_meta":1055097,"kb_avail":93314032,"statfs":{"total":96636764160,"available":95553568768,"internally_reserved":0,"allocated":2359296,"data_stored":1417520,"data_compressed":2714,"data_compressed_allocated":221184,"data_compres
sed_original":442368,"omap_allocated":399433,"internal_metadata":1080420279},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":2714,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":2714,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":397843,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":544054,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":367311,"internal_metadata":0}]}} 2026-03-24T17:34:57.149 DEBUG:teuthology.orchestra.run.vm01:> sudo ceph --cluster ceph config set global mon_health_to_clog false 2026-03-24T17:34:57.332 INFO:teuthology.misc:Shutting down mds daemons... 2026-03-24T17:34:57.332 INFO:teuthology.misc:Shutting down osd daemons... 
2026-03-24T17:34:57.332 DEBUG:tasks.ceph.osd.0:waiting for process to exit 2026-03-24T17:34:57.332 INFO:teuthology.orchestra.run:waiting for 300 2026-03-24T17:34:57.424 INFO:tasks.ceph.osd.0:Stopped 2026-03-24T17:34:57.425 DEBUG:tasks.ceph.osd.1:waiting for process to exit 2026-03-24T17:34:57.425 INFO:teuthology.orchestra.run:waiting for 300 2026-03-24T17:34:57.517 INFO:tasks.ceph.osd.1:Stopped 2026-03-24T17:34:57.517 DEBUG:tasks.ceph.osd.2:waiting for process to exit 2026-03-24T17:34:57.517 INFO:teuthology.orchestra.run:waiting for 300 2026-03-24T17:34:57.612 INFO:tasks.ceph.osd.2:Stopped 2026-03-24T17:34:57.612 INFO:teuthology.misc:Shutting down mgr daemons... 2026-03-24T17:34:57.612 DEBUG:tasks.ceph.mgr.x:waiting for process to exit 2026-03-24T17:34:57.612 INFO:teuthology.orchestra.run:waiting for 300 2026-03-24T17:34:57.641 INFO:tasks.ceph.mgr.x:Stopped 2026-03-24T17:34:57.641 INFO:teuthology.misc:Shutting down mon daemons... 2026-03-24T17:34:57.641 DEBUG:tasks.ceph.mon.a:waiting for process to exit 2026-03-24T17:34:57.641 INFO:teuthology.orchestra.run:waiting for 300 2026-03-24T17:34:57.696 INFO:tasks.ceph.mon.a:Stopped 2026-03-24T17:34:57.696 INFO:tasks.ceph:Checking cluster log for badness... 
2026-03-24T17:34:57.696 DEBUG:teuthology.orchestra.run.vm01:> sudo egrep '\[ERR\]|\[WRN\]|\[SEC\]' /var/log/ceph/ceph.log | egrep -v '\(MDS_ALL_DOWN\)' | egrep -v '\(MDS_UP_LESS_THAN_MAX\)' | egrep -v '\(OSD_SLOW_PING_TIME' | head -n 1 2026-03-24T17:34:57.747 INFO:tasks.ceph:Unmounting /var/lib/ceph/osd/ceph-0 on ubuntu@vm01.local 2026-03-24T17:34:57.747 DEBUG:teuthology.orchestra.run.vm01:> sync && sudo umount -f /var/lib/ceph/osd/ceph-0 2026-03-24T17:34:57.847 INFO:tasks.ceph:Unmounting /var/lib/ceph/osd/ceph-1 on ubuntu@vm01.local 2026-03-24T17:34:57.847 DEBUG:teuthology.orchestra.run.vm01:> sync && sudo umount -f /var/lib/ceph/osd/ceph-1 2026-03-24T17:34:57.894 INFO:tasks.ceph:Unmounting /var/lib/ceph/osd/ceph-2 on ubuntu@vm01.local 2026-03-24T17:34:57.894 DEBUG:teuthology.orchestra.run.vm01:> sync && sudo umount -f /var/lib/ceph/osd/ceph-2 2026-03-24T17:34:57.943 INFO:tasks.ceph:Archiving mon data... 2026-03-24T17:34:57.944 DEBUG:teuthology.misc:Transferring archived files from vm01:/var/lib/ceph/mon/ceph-a to /archive/kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps/3621/data/mon.a.tgz 2026-03-24T17:34:57.944 DEBUG:teuthology.orchestra.run.vm01:> mktemp 2026-03-24T17:34:57.947 INFO:teuthology.orchestra.run.vm01.stdout:/tmp/tmp.rYRgM0Pxvh 2026-03-24T17:34:57.947 DEBUG:teuthology.orchestra.run.vm01:> sudo tar cz -f - -C /var/lib/ceph/mon/ceph-a -- . > /tmp/tmp.rYRgM0Pxvh 2026-03-24T17:34:58.047 DEBUG:teuthology.orchestra.run.vm01:> sudo chmod 0666 /tmp/tmp.rYRgM0Pxvh 2026-03-24T17:34:58.102 DEBUG:teuthology.orchestra.remote:vm01:/tmp/tmp.rYRgM0Pxvh is 419KB 2026-03-24T17:34:58.150 DEBUG:teuthology.orchestra.run.vm01:> rm -fr /tmp/tmp.rYRgM0Pxvh 2026-03-24T17:34:58.153 INFO:tasks.ceph:Cleaning ceph cluster... 
2026-03-24T17:34:58.153 DEBUG:teuthology.orchestra.run.vm01:> sudo rm -rf -- /etc/ceph/ceph.conf /etc/ceph/ceph.keyring /home/ubuntu/cephtest/ceph.data /home/ubuntu/cephtest/ceph.monmap /home/ubuntu/cephtest/../*.pid
2026-03-24T17:34:58.244 INFO:teuthology.util.scanner:summary_data or yaml_file is empty!
2026-03-24T17:34:58.244 INFO:tasks.ceph:Archiving crash dumps...
2026-03-24T17:34:58.244 DEBUG:teuthology.misc:Transferring archived files from vm01:/var/lib/ceph/crash to /archive/kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps/3621/remote/vm01/crash
2026-03-24T17:34:58.244 DEBUG:teuthology.orchestra.run.vm01:> sudo tar c -f - -C /var/lib/ceph/crash -- .
2026-03-24T17:34:58.292 INFO:tasks.ceph:Compressing logs...
2026-03-24T17:34:58.292 DEBUG:teuthology.orchestra.run.vm01:> time sudo find /var/log/ceph -name '*.log' -print0 | sudo xargs --max-args=1 --max-procs=0 --verbose -0 --no-run-if-empty -- gzip -5 --verbose --
[2026-03-24T17:34:58.341 .. 17:34:58.372] INFO:teuthology.orchestra.run.vm01.stderr: interleaved output of the parallel `gzip -5 --verbose` jobs elided: each of ~100 /var/log/ceph/ceph-client.admin.*.log files reported "replaced with <name>.log.gz" (mostly 0.0% compression; a few at 1.2%-58.0%), and /var/log/ceph/ceph.audit.log compressed 89.9% -- replaced with /var/log/ceph/ceph.audit.log.gz
--verbose -- /var/log/ceph/ceph-client.admin.70236.log 2026-03-24T17:34:58.372 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27179.log.gz 2026-03-24T17:34:58.372 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.27761.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70859.log 2026-03-24T17:34:58.372 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27761.log.gz 2026-03-24T17:34:58.372 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60499.log 2026-03-24T17:34:58.372 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.70236.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70236.log.gz 2026-03-24T17:34:58.372 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.70859.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70859.log.gz 2026-03-24T17:34:58.372 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34959.log 2026-03-24T17:34:58.373 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83509.log 2026-03-24T17:34:58.373 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.60499.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60499.log.gz 2026-03-24T17:34:58.373 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.34959.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76421.log 2026-03-24T17:34:58.373 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34959.log.gz 2026-03-24T17:34:58.373 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.83509.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56854.log 2026-03-24T17:34:58.373 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.83509.log.gz 2026-03-24T17:34:58.373 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.76421.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73825.log 2026-03-24T17:34:58.373 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76421.log.gz 2026-03-24T17:34:58.374 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29820.log 2026-03-24T17:34:58.374 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.56854.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56854.log.gz 2026-03-24T17:34:58.374 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.73825.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33012.log 2026-03-24T17:34:58.374 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73825.log.gz 2026-03-24T17:34:58.374 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47435.log 2026-03-24T17:34:58.374 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.29820.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29820.log.gz 2026-03-24T17:34:58.374 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.33012.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32535.log 2026-03-24T17:34:58.375 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.33012.log.gz 2026-03-24T17:34:58.375 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84245.log 2026-03-24T17:34:58.375 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.47435.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47435.log.gz 2026-03-24T17:34:58.375 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.32535.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.60714.log 2026-03-24T17:34:58.375 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.84245.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32535.log.gz 2026-03-24T17:34:58.375 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84245.log.gz 2026-03-24T17:34:58.375 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77439.log 2026-03-24T17:34:58.376 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.60714.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60714.log.gz 2026-03-24T17:34:58.376 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57319.log 2026-03-24T17:34:58.376 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61230.log 2026-03-24T17:34:58.376 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.77439.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77439.log.gz/var/log/ceph/ceph-client.admin.57319.log: 2026-03-24T17:34:58.376 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85738.log 2026-03-24T17:34:58.376 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57319.log.gz 2026-03-24T17:34:58.376 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.61230.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71434.log 2026-03-24T17:34:58.376 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61230.log.gz 2026-03-24T17:34:58.377 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82924.log 2026-03-24T17:34:58.377 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.85738.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.85738.log.gz 2026-03-24T17:34:58.377 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77358.log 2026-03-24T17:34:58.377 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.71434.log: /var/log/ceph/ceph-client.admin.82924.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49255.log 2026-03-24T17:34:58.377 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82924.log.gz 2026-03-24T17:34:58.378 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71434.log.gz 2026-03-24T17:34:58.378 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.77358.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73750.log 2026-03-24T17:34:58.378 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77358.log.gz 2026-03-24T17:34:58.378 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36081.log 2026-03-24T17:34:58.378 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.49255.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49255.log.gz 2026-03-24T17:34:58.378 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53118.log 2026-03-24T17:34:58.378 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.73750.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73750.log.gz 2026-03-24T17:34:58.378 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.36081.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28385.log 2026-03-24T17:34:58.379 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36081.log.gz 2026-03-24T17:34:58.379 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.53118.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.36491.log 2026-03-24T17:34:58.379 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53118.log.gz 2026-03-24T17:34:58.379 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75901.log 2026-03-24T17:34:58.379 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.28385.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28385.log.gz 2026-03-24T17:34:58.379 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58191.log 2026-03-24T17:34:58.379 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.36491.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36491.log.gz 2026-03-24T17:34:58.379 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.75901.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62003.log 2026-03-24T17:34:58.380 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75901.log.gz 2026-03-24T17:34:58.380 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.58191.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34457.log 2026-03-24T17:34:58.380 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58191.log.gz 2026-03-24T17:34:58.380 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56768.log 2026-03-24T17:34:58.380 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.62003.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62003.log.gz 2026-03-24T17:34:58.380 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59594.log 2026-03-24T17:34:58.380 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.34457.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.34457.log.gz 2026-03-24T17:34:58.381 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.56768.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.56768.log.gz -5 2026-03-24T17:34:58.381 INFO:teuthology.orchestra.run.vm01.stderr: --verbose -- /var/log/ceph/ceph-client.admin.49557.log 2026-03-24T17:34:58.381 INFO:teuthology.orchestra.run.vm01.stderr:gzip/var/log/ceph/ceph-client.admin.59594.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.27740.log 2026-03-24T17:34:58.381 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59594.log.gz 2026-03-24T17:34:58.381 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.49557.log: /var/log/ceph/ceph-client.admin.27740.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77315.log 2026-03-24T17:34:58.381 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49557.log.gz 2026-03-24T17:34:58.381 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.27740.log.gz 2026-03-24T17:34:58.382 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58801.log 2026-03-24T17:34:58.382 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.77315.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77315.log.gz 2026-03-24T17:34:58.382 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85802.log 2026-03-24T17:34:58.382 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.58801.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49405.log 2026-03-24T17:34:58.382 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58801.log.gz 2026-03-24T17:34:58.382 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5/var/log/ceph/ceph-client.admin.85802.log: 
--verbose -- /var/log/ceph/ceph-client.admin.64535.log 2026-03-24T17:34:58.383 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85802.log.gz 2026-03-24T17:34:58.383 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.49405.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49405.log.gz 2026-03-24T17:34:58.383 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39234.log 2026-03-24T17:34:58.383 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61600.log 2026-03-24T17:34:58.383 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.64535.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64535.log.gz 2026-03-24T17:34:58.383 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.39234.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34217.log 2026-03-24T17:34:58.383 INFO:teuthology.orchestra.run.vm01.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.39234.log.gz 2026-03-24T17:34:58.384 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.61600.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61600.log.gz 2026-03-24T17:34:58.384 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54656.log 2026-03-24T17:34:58.384 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.34217.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34217.log.gz 2026-03-24T17:34:58.384 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87397.log 2026-03-24T17:34:58.384 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66231.log 2026-03-24T17:34:58.384 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.54656.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.54656.log.gz 2026-03-24T17:34:58.385 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.87397.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30681.log 2026-03-24T17:34:58.385 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87397.log.gz 2026-03-24T17:34:58.385 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89770.log 2026-03-24T17:34:58.385 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.66231.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66231.log.gz 2026-03-24T17:34:58.385 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.30681.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30681.log.gz 2026-03-24T17:34:58.385 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83638.log 2026-03-24T17:34:58.386 INFO:teuthology.orchestra.run.vm01.stderr:gzip/var/log/ceph/ceph-client.admin.89770.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.61316.log 2026-03-24T17:34:58.386 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89770.log.gz 2026-03-24T17:34:58.386 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.83638.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77473.log 2026-03-24T17:34:58.386 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83638.log.gz 2026-03-24T17:34:58.386 INFO:teuthology.orchestra.run.vm01.stderr:gzip/var/log/ceph/ceph-client.admin.61316.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.86988.log 2026-03-24T17:34:58.386 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61316.log.gz 2026-03-24T17:34:58.386 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.77473.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.58341.log 2026-03-24T17:34:58.386 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77473.log.gz 2026-03-24T17:34:58.387 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35039.log 2026-03-24T17:34:58.387 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.86988.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86988.log.gz/var/log/ceph/ceph-client.admin.58341.log: 2026-03-24T17:34:58.387 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77490.log 2026-03-24T17:34:58.387 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58341.log.gz 2026-03-24T17:34:58.387 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82007.log 2026-03-24T17:34:58.387 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.35039.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35039.log.gz 2026-03-24T17:34:58.387 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.77490.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89189.log 2026-03-24T17:34:58.388 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77490.log.gz 2026-03-24T17:34:58.388 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87296.log 2026-03-24T17:34:58.388 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.82007.log: 0.0%/var/log/ceph/ceph-client.admin.89189.log: -- replaced with /var/log/ceph/ceph-client.admin.82007.log.gz 2026-03-24T17:34:58.388 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89189.log.gz 2026-03-24T17:34:58.388 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.84068.log 2026-03-24T17:34:58.388 INFO:teuthology.orchestra.run.vm01.stderr:gzip/var/log/ceph/ceph-client.admin.87296.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.59161.log 2026-03-24T17:34:58.388 INFO:teuthology.orchestra.run.vm01.stderr: 26.2% -- replaced with /var/log/ceph/ceph-client.admin.87296.log.gz 2026-03-24T17:34:58.389 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.84068.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48283.log 2026-03-24T17:34:58.389 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84068.log.gz 2026-03-24T17:34:58.389 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89060.log 2026-03-24T17:34:58.389 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.59161.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59161.log.gz 2026-03-24T17:34:58.389 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.48283.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48283.log.gz 2026-03-24T17:34:58.389 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41429.log 2026-03-24T17:34:58.389 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83660.log 2026-03-24T17:34:58.389 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.89060.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89060.log.gz 2026-03-24T17:34:58.390 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.41429.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76928.log 2026-03-24T17:34:58.390 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41429.log.gz 2026-03-24T17:34:58.390 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.74630.log 2026-03-24T17:34:58.390 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.83660.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83660.log.gz/var/log/ceph/ceph-client.admin.76928.log: 2026-03-24T17:34:58.390 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76928.log.gz 2026-03-24T17:34:58.390 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82687.log 2026-03-24T17:34:58.391 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86493.log 2026-03-24T17:34:58.391 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.74630.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74630.log.gz 2026-03-24T17:34:58.391 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33497.log 2026-03-24T17:34:58.391 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.82687.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82687.log.gz/var/log/ceph/ceph-client.admin.86493.log: 2026-03-24T17:34:58.391 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62878.log 2026-03-24T17:34:58.391 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86493.log.gz 2026-03-24T17:34:58.391 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.33497.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49686.log 2026-03-24T17:34:58.391 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33497.log.gz 2026-03-24T17:34:58.392 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55345.log 2026-03-24T17:34:58.392 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.62878.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.62878.log.gz 2026-03-24T17:34:58.392 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67749.log 2026-03-24T17:34:58.392 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.49686.log: /var/log/ceph/ceph-client.admin.55345.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49686.log.gz 2026-03-24T17:34:58.392 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59333.log 2026-03-24T17:34:58.392 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55345.log.gz 2026-03-24T17:34:58.392 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.67749.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70515.log 2026-03-24T17:34:58.392 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67749.log.gz 2026-03-24T17:34:58.393 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83246.log 2026-03-24T17:34:58.393 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.59333.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59333.log.gz 2026-03-24T17:34:58.393 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72702.log 2026-03-24T17:34:58.393 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.70515.log: /var/log/ceph/ceph-client.admin.83246.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70515.log.gz 2026-03-24T17:34:58.393 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83246.log.gz 2026-03-24T17:34:58.393 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63316.log 2026-03-24T17:34:58.393 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.72702.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.72702.log.gz 2026-03-24T17:34:58.394 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41638.log 2026-03-24T17:34:58.394 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30250.log 2026-03-24T17:34:58.394 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.63316.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63316.log.gz/var/log/ceph/ceph-client.admin.41638.log: 2026-03-24T17:34:58.394 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62202.log 2026-03-24T17:34:58.394 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41638.log.gz 2026-03-24T17:34:58.394 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.30250.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83160.log 2026-03-24T17:34:58.394 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30250.log.gz 2026-03-24T17:34:58.394 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32893.log 2026-03-24T17:34:58.395 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.62202.log: /var/log/ceph/ceph-client.admin.83160.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62202.log.gz 2026-03-24T17:34:58.395 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83160.log.gz 2026-03-24T17:34:58.395 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75578.log 2026-03-24T17:34:58.395 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.32893.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45381.log 2026-03-24T17:34:58.395 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.75578.log: 
2026-03-24T17:34:58.395 INFO:teuthology.orchestra.run.vm01.stderr: [de-interleaved: output of 137 concurrent "gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.<pid>.log" jobs completing between 17:34:58.395 and 17:34:58.427; each <pid>.log was replaced with <pid>.log.gz. Per-file results below, matched by the filename in each "replaced with" message:]
  compression > 0%: 82255.log: 86.6%, 42327.log: 58.1%, 53398.log: 28.9%, 37840.log: 25.6%, 38646.log: 25.5%, 54240.log: 25.3%; 1.2% each: 32518, 32791, 32808, 32893, 33217, 33303, 51327
  0.0% (pids, sorted): 25728, 26832, 27595, 27847, 27890, 29176, 30143, 30272, 30489, 30605, 30815, 31135, 31628, 33597, 33737, 34197, 34979, 35179, 35199, 36286, 36644, 36814, 39749, 41257, 41493, 42649, 43065, 43107, 43501, 43720, 44866, 44973, 45038, 45381, 46185, 46713, 46790, 47891, 48409, 48655, 48676, 48760, 49276, 49362, 50533, 50919, 52580, 52924, 54198, 54400, 54421, 55280, 55861, 56216, 56364, 56489, 56940, 57400, 57766, 58707, 58977, 59017, 60177, 60542, 60822, 61579, 61870, 61928, 62918, 62978, 64201, 64419, 64480, 64924, 65866, 66846, 67168, 67383, 67813, 67921, 68575, 69292, 70666, 70730, 71414, 71454, 71590, 72122, 73280, 73457, 73534, 74263, 74560, 75578, 75707, 75879, 76051, 77079, 77165, 77514, 77688, 78161, 78581, 78859, 79098, 79283, 79974, 81138, 82075, 82945, 83595, 83918, 84272, 84510, 84746, 85133, 86081, 86859, 87481, 87627, 88437, 88652, 90739, 90765
  still running at end of this excerpt: 26481, 46307
2026-03-24T17:34:58.427 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.26481.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54634.log 2026-03-24T17:34:58.427 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26481.log.gz 2026-03-24T17:34:58.428 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39448.log 2026-03-24T17:34:58.428 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.46307.log: /var/log/ceph/ceph-client.admin.54634.log: 0.0%gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67018.log 2026-03-24T17:34:58.428 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.46307.log.gz 2026-03-24T17:34:58.428 INFO:teuthology.orchestra.run.vm01.stderr: 25.7% -- replaced with /var/log/ceph/ceph-client.admin.54634.log.gz 2026-03-24T17:34:58.428 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.39448.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39448.log.gz 2026-03-24T17:34:58.428 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51776.log 2026-03-24T17:34:58.428 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.67018.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67018.log.gz 2026-03-24T17:34:58.429 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58320.log 2026-03-24T17:34:58.429 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56384.log 2026-03-24T17:34:58.429 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.51776.log: /var/log/ceph/ceph-client.admin.58320.log: 0.0%gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27614.log 2026-03-24T17:34:58.429 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.58320.log.gz 2026-03-24T17:34:58.429 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.51776.log.gz 2026-03-24T17:34:58.429 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78056.log 2026-03-24T17:34:58.429 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.56384.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56384.log.gz 2026-03-24T17:34:58.429 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.27614.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63752.log 2026-03-24T17:34:58.430 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27614.log.gz 2026-03-24T17:34:58.430 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38688.log 2026-03-24T17:34:58.430 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.78056.log: /var/log/ceph/ceph-client.admin.63752.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74184.log 2026-03-24T17:34:58.430 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63752.log.gz 2026-03-24T17:34:58.430 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78056.log.gz 2026-03-24T17:34:58.430 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.38688.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33132.log 2026-03-24T17:34:58.430 INFO:teuthology.orchestra.run.vm01.stderr: 25.9% -- replaced with /var/log/ceph/ceph-client.admin.38688.log.gz 2026-03-24T17:34:58.430 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26962.log 2026-03-24T17:34:58.430 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.74184.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.74184.log.gz 2026-03-24T17:34:58.431 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77057.log 2026-03-24T17:34:58.431 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.33132.log: /var/log/ceph/ceph-client.admin.26962.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63431.log 2026-03-24T17:34:58.431 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26962.log.gz 2026-03-24T17:34:58.431 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.33132.log.gz 2026-03-24T17:34:58.431 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.77057.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77057.log.gz 2026-03-24T17:34:58.431 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35570.log 2026-03-24T17:34:58.431 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.63431.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65225.log 2026-03-24T17:34:58.432 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63431.log.gz 2026-03-24T17:34:58.432 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.35570.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78108.log 2026-03-24T17:34:58.432 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35570.log.gz 2026-03-24T17:34:58.432 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.65225.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40322.log 2026-03-24T17:34:58.432 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65225.log.gz 2026-03-24T17:34:58.432 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.68762.log 2026-03-24T17:34:58.432 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.78108.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78108.log.gz 2026-03-24T17:34:58.432 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32586.log 2026-03-24T17:34:58.432 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.40322.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40322.log.gz 2026-03-24T17:34:58.432 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.68762.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46535.log 2026-03-24T17:34:58.433 INFO:teuthology.orchestra.run.vm01.stderr: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.68762.log.gz 2026-03-24T17:34:58.433 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.32586.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40564.log 2026-03-24T17:34:58.433 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32586.log.gz 2026-03-24T17:34:58.433 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77591.log 2026-03-24T17:34:58.433 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.46535.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68492.log 2026-03-24T17:34:58.433 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.40564.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.46535.log.gz 2026-03-24T17:34:58.434 INFO:teuthology.orchestra.run.vm01.stderr: 0.0%/var/log/ceph/ceph-client.admin.77591.log: -- replaced with /var/log/ceph/ceph-client.admin.40564.log.gz 2026-03-24T17:34:58.434 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48590.log 2026-03-24T17:34:58.434 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.77591.log.gz 2026-03-24T17:34:58.434 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.68492.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86235.log 2026-03-24T17:34:58.434 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68492.log.gz 2026-03-24T17:34:58.434 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70322.log 2026-03-24T17:34:58.434 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.48590.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48590.log.gz 2026-03-24T17:34:58.434 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50240.log 2026-03-24T17:34:58.434 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.86235.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86235.log.gz 2026-03-24T17:34:58.434 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.70322.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63508.log 2026-03-24T17:34:58.434 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70322.log.gz 2026-03-24T17:34:58.435 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.50240.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76456.log 2026-03-24T17:34:58.435 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50240.log.gz 2026-03-24T17:34:58.435 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54032.log 2026-03-24T17:34:58.435 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.63508.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63508.log.gz 2026-03-24T17:34:58.435 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.70451.log 2026-03-24T17:34:58.435 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.76456.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76456.log.gz 2026-03-24T17:34:58.435 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.54032.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25589.log 2026-03-24T17:34:58.435 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54032.log.gz 2026-03-24T17:34:58.435 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.70451.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86644.log 2026-03-24T17:34:58.435 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70451.log.gz 2026-03-24T17:34:58.436 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44123.log 2026-03-24T17:34:58.436 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.25589.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25589.log.gz 2026-03-24T17:34:58.436 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66167.log 2026-03-24T17:34:58.436 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.86644.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86644.log.gz/var/log/ceph/ceph-client.admin.44123.log: 2026-03-24T17:34:58.436 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36030.log 2026-03-24T17:34:58.436 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.66167.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66167.log.gz 2026-03-24T17:34:58.436 INFO:teuthology.orchestra.run.vm01.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.44123.log.gz 2026-03-24T17:34:58.437 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.27933.log 2026-03-24T17:34:58.437 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.36030.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44365.log 2026-03-24T17:34:58.437 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36030.log.gz 2026-03-24T17:34:58.437 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.27933.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52903.log 2026-03-24T17:34:58.437 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27933.log.gz 2026-03-24T17:34:58.437 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.44365.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76821.log 2026-03-24T17:34:58.437 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.52903.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.44365.log.gz 2026-03-24T17:34:58.437 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52903.log.gz 2026-03-24T17:34:58.437 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75103.log 2026-03-24T17:34:58.438 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.76821.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60327.log 2026-03-24T17:34:58.438 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76821.log.gz 2026-03-24T17:34:58.438 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.75103.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45059.log 2026-03-24T17:34:58.438 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75103.log.gz 2026-03-24T17:34:58.438 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.60327.log: 
0.0% -- replaced with /var/log/ceph/ceph-client.admin.60327.log.gz 2026-03-24T17:34:58.438 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81120.log 2026-03-24T17:34:58.438 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.45059.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87202.log 2026-03-24T17:34:58.439 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45059.log.gz 2026-03-24T17:34:58.439 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.81120.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41557.log 2026-03-24T17:34:58.439 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81120.log.gz 2026-03-24T17:34:58.439 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.87202.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59844.log 2026-03-24T17:34:58.439 INFO:teuthology.orchestra.run.vm01.stderr: 28.9% -- replaced with /var/log/ceph/ceph-client.admin.87202.log.gz 2026-03-24T17:34:58.439 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63628.log 2026-03-24T17:34:58.439 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.41557.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41557.log.gz 2026-03-24T17:34:58.439 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89146.log 2026-03-24T17:34:58.439 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.59844.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59844.log.gz 2026-03-24T17:34:58.440 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.63628.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35894.log 2026-03-24T17:34:58.440 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.63628.log.gz 2026-03-24T17:34:58.440 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.89146.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76348.log 2026-03-24T17:34:58.440 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89146.log.gz 2026-03-24T17:34:58.440 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48203.log 2026-03-24T17:34:58.440 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.35894.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35894.log.gz 2026-03-24T17:34:58.440 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30920.log 2026-03-24T17:34:58.440 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.76348.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76348.log.gz 2026-03-24T17:34:58.440 INFO:teuthology.orchestra.run.vm01.stderr:gzip/var/log/ceph/ceph-client.admin.48203.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.83767.log 2026-03-24T17:34:58.440 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48203.log.gz 2026-03-24T17:34:58.441 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.30920.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30920.log.gz 2026-03-24T17:34:58.441 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42228.log 2026-03-24T17:34:58.441 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.83767.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58535.log 2026-03-24T17:34:58.441 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83767.log.gz 2026-03-24T17:34:58.441 INFO:teuthology.orchestra.run.vm01.stderr:gzip/var/log/ceph/ceph-client.admin.42228.log: -5 
--verbose -- /var/log/ceph/ceph-client.admin.62336.log 2026-03-24T17:34:58.441 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88125.log 2026-03-24T17:34:58.441 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.58535.log: 58.4% -- replaced with /var/log/ceph/ceph-client.admin.42228.log.gz 2026-03-24T17:34:58.442 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58535.log.gz 2026-03-24T17:34:58.442 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.62336.log: gzip 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62336.log.gz -5 2026-03-24T17:34:58.442 INFO:teuthology.orchestra.run.vm01.stderr: --verbose -- /var/log/ceph/ceph-client.admin.86816.log 2026-03-24T17:34:58.442 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.88125.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76403.log 2026-03-24T17:34:58.442 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88125.log.gz 2026-03-24T17:34:58.442 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.86816.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86816.log.gz 2026-03-24T17:34:58.442 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45210.log 2026-03-24T17:34:58.442 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.76403.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66638.log 2026-03-24T17:34:58.442 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76403.log.gz 2026-03-24T17:34:58.443 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.45210.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34337.log 2026-03-24T17:34:58.443 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.45210.log.gz 2026-03-24T17:34:58.443 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.66638.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66638.log.gz 2026-03-24T17:34:58.443 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45535.log 2026-03-24T17:34:58.443 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61166.log 2026-03-24T17:34:58.443 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.34337.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34337.log.gz 2026-03-24T17:34:58.443 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.45535.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72802.log 2026-03-24T17:34:58.443 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45535.log.gz 2026-03-24T17:34:58.443 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.61166.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76692.log 2026-03-24T17:34:58.443 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61166.log.gz 2026-03-24T17:34:58.444 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50492.log 2026-03-24T17:34:58.444 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.72802.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72802.log.gz 2026-03-24T17:34:58.444 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59078.log 2026-03-24T17:34:58.444 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.76692.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76692.log.gz 2026-03-24T17:34:58.444 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.50492.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.33234.log 2026-03-24T17:34:58.444 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50492.log.gz 2026-03-24T17:34:58.444 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.59078.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83445.log 2026-03-24T17:34:58.444 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59078.log.gz 2026-03-24T17:34:58.445 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67104.log 2026-03-24T17:34:58.445 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.33234.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81684.log 2026-03-24T17:34:58.445 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.83445.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83445.log.gz 2026-03-24T17:34:58.445 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.33234.log.gz 2026-03-24T17:34:58.445 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.67104.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40600.log 2026-03-24T17:34:58.445 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67104.log.gz 2026-03-24T17:34:58.445 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.81684.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81684.log.gz 2026-03-24T17:34:58.445 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28192.log 2026-03-24T17:34:58.445 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34657.log 2026-03-24T17:34:58.446 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.40600.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.40600.log.gz 2026-03-24T17:34:58.446 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.28192.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43823.log 2026-03-24T17:34:58.446 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28192.log.gz 2026-03-24T17:34:58.446 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.34657.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36439.log 2026-03-24T17:34:58.446 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34657.log.gz 2026-03-24T17:34:58.446 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29734.log 2026-03-24T17:34:58.446 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.43823.log: /var/log/ceph/ceph-client.admin.36439.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45750.log 2026-03-24T17:34:58.446 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36439.log.gz 2026-03-24T17:34:58.447 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.43823.log.gz 2026-03-24T17:34:58.447 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.29734.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29734.log.gz 2026-03-24T17:34:58.447 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66348.log 2026-03-24T17:34:58.447 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.45750.log: gzip 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45750.log.gz -5 2026-03-24T17:34:58.447 INFO:teuthology.orchestra.run.vm01.stderr: --verbose -- /var/log/ceph/ceph-client.admin.42981.log 2026-03-24T17:34:58.447 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.66348.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.69335.log 2026-03-24T17:34:58.447 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66348.log.gz
[... concurrent `gzip -5 --verbose` stderr for the remaining /var/log/ceph/ceph-client.admin.*.log files elided: between 2026-03-24T17:34:58.447 and 2026-03-24T17:34:58.480 each client log was compressed and replaced with its .log.gz counterpart, with reported size reductions mostly 0.0% and occasionally higher (e.g. 12.9%, 25.8%, 49.0%, 55.5%, 58.8%, 87.4%); the per-file messages are interleaved by the parallel gzip processes and are not individually reproduced here ...]
2026-03-24T17:34:58.480 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.74508.log: gzip -5
--verbose -- /var/log/ceph/ceph-client.admin.34177.log 2026-03-24T17:34:58.480 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74508.log.gz 2026-03-24T17:34:58.480 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40870.log 2026-03-24T17:34:58.480 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.63811.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63811.log.gz 2026-03-24T17:34:58.480 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.34177.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34177.log.gz 2026-03-24T17:34:58.480 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32876.log 2026-03-24T17:34:58.481 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.40870.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41214.log 2026-03-24T17:34:58.481 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40870.log.gz 2026-03-24T17:34:58.481 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.32876.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45449.log 2026-03-24T17:34:58.481 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32876.log.gz 2026-03-24T17:34:58.481 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88523.log 2026-03-24T17:34:58.481 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.41214.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41214.log.gz 2026-03-24T17:34:58.481 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.45449.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64458.log 2026-03-24T17:34:58.482 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.45449.log.gz 2026-03-24T17:34:58.482 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64298.log 2026-03-24T17:34:58.482 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.88523.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88523.log.gz 2026-03-24T17:34:58.482 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74888.log 2026-03-24T17:34:58.482 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.64458.log: /var/log/ceph/ceph-client.admin.64298.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64298.log.gz 2026-03-24T17:34:58.482 INFO:teuthology.orchestra.run.vm01.stderr: 56.2% -- replaced with /var/log/ceph/ceph-client.admin.64458.log.gzgzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76842.log 2026-03-24T17:34:58.482 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:58.482 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33697.log 2026-03-24T17:34:58.483 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.74888.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74888.log.gz 2026-03-24T17:34:58.483 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.76842.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76842.log.gz 2026-03-24T17:34:58.483 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63792.log 2026-03-24T17:34:58.483 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.33697.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44344.log 2026-03-24T17:34:58.483 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33697.log.gz 2026-03-24T17:34:58.483 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.63792.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.63792.log.gz 2026-03-24T17:34:58.483 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69206.log 2026-03-24T17:34:58.483 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65737.log 2026-03-24T17:34:58.484 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.44344.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44344.log.gz 2026-03-24T17:34:58.484 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.69206.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69550.log 2026-03-24T17:34:58.484 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69206.log.gz 2026-03-24T17:34:58.484 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42069.log 2026-03-24T17:34:58.484 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.65737.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65737.log.gz/var/log/ceph/ceph-client.admin.69550.log: 2026-03-24T17:34:58.484 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69550.log.gz 2026-03-24T17:34:58.484 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66867.log 2026-03-24T17:34:58.484 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35434.log 2026-03-24T17:34:58.485 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.42069.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42069.log.gz 2026-03-24T17:34:58.485 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.66867.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52795.log 2026-03-24T17:34:58.485 INFO:teuthology.orchestra.run.vm01.stderr: 26.6% -- replaced with 
/var/log/ceph/ceph-client.admin.66867.log.gz 2026-03-24T17:34:58.485 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.35434.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44045.log 2026-03-24T17:34:58.485 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.52795.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52795.log.gz 2026-03-24T17:34:58.485 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53521.log 2026-03-24T17:34:58.485 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35434.log.gz 2026-03-24T17:34:58.485 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74087.log 2026-03-24T17:34:58.486 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.44045.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44045.log.gz 2026-03-24T17:34:58.486 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48847.log 2026-03-24T17:34:58.486 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.53521.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53521.log.gz 2026-03-24T17:34:58.486 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.74087.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69636.log 2026-03-24T17:34:58.486 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74087.log.gz 2026-03-24T17:34:58.486 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72870.log 2026-03-24T17:34:58.487 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.48847.log: /var/log/ceph/ceph-client.admin.69636.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68222.log 2026-03-24T17:34:58.487 INFO:teuthology.orchestra.run.vm01.stderr: 54.8% 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.69636.log.gz 2026-03-24T17:34:58.487 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.48847.log.gz 2026-03-24T17:34:58.487 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84854.log 2026-03-24T17:34:58.487 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.72870.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72870.log.gz 2026-03-24T17:34:58.487 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90067.log 2026-03-24T17:34:58.487 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.68222.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68222.log.gz 2026-03-24T17:34:58.487 INFO:teuthology.orchestra.run.vm01.stderr:gzip/var/log/ceph/ceph-client.admin.84854.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.65651.log 2026-03-24T17:34:58.487 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84854.log.gz 2026-03-24T17:34:58.487 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.90067.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73996.log 2026-03-24T17:34:58.487 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90067.log.gz 2026-03-24T17:34:58.488 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88759.log 2026-03-24T17:34:58.488 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.65651.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65651.log.gz 2026-03-24T17:34:58.488 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84186.log 2026-03-24T17:34:58.488 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.73996.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.73996.log.gz 2026-03-24T17:34:58.488 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.88759.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80511.log 2026-03-24T17:34:58.488 INFO:teuthology.orchestra.run.vm01.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.88759.log.gz 2026-03-24T17:34:58.488 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34777.log 2026-03-24T17:34:58.489 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.84186.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84186.log.gz 2026-03-24T17:34:58.489 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.80511.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57112.log 2026-03-24T17:34:58.489 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80511.log.gz 2026-03-24T17:34:58.489 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78091.log 2026-03-24T17:34:58.489 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.34777.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34777.log.gz 2026-03-24T17:34:58.489 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78126.log 2026-03-24T17:34:58.489 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.57112.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57112.log.gz 2026-03-24T17:34:58.489 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.78091.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30963.log 2026-03-24T17:34:58.489 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78091.log.gz 2026-03-24T17:34:58.490 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.50790.log 2026-03-24T17:34:58.490 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.78126.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78126.log.gz 2026-03-24T17:34:58.490 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27141.log 2026-03-24T17:34:58.490 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.30963.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30963.log.gz 2026-03-24T17:34:58.490 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.50790.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63710.log 2026-03-24T17:34:58.490 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50790.log.gz 2026-03-24T17:34:58.490 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.27141.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27141.log.gzgzip 2026-03-24T17:34:58.491 INFO:teuthology.orchestra.run.vm01.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.44465.log 2026-03-24T17:34:58.491 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.63710.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.63710.log.gz 2026-03-24T17:34:58.491 INFO:teuthology.orchestra.run.vm01.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.33757.log 2026-03-24T17:34:58.491 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.44465.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88326.log 2026-03-24T17:34:58.491 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.33757.log: 0.0% 41.3% -- replaced with /var/log/ceph/ceph-client.admin.33757.log.gz 2026-03-24T17:34:58.491 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.44465.log.gzgzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46094.log 
2026-03-24T17:34:58.491 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:58.492 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.88326.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29691.log 2026-03-24T17:34:58.492 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88326.log.gz 2026-03-24T17:34:58.492 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.46094.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46094.log.gz 2026-03-24T17:34:58.492 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78633.log 2026-03-24T17:34:58.492 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.29691.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78316.log 2026-03-24T17:34:58.492 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29691.log.gz/var/log/ceph/ceph-client.admin.78633.log: 2026-03-24T17:34:58.492 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78633.log.gz 2026-03-24T17:34:58.492 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35911.log 2026-03-24T17:34:58.493 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.78316.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84789.log 2026-03-24T17:34:58.493 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78316.log.gz 2026-03-24T17:34:58.493 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.35911.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35911.log.gz 2026-03-24T17:34:58.493 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78431.log 2026-03-24T17:34:58.493 
INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.84789.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86159.log 2026-03-24T17:34:58.494 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84789.log.gz 2026-03-24T17:34:58.494 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.78431.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78431.log.gz 2026-03-24T17:34:58.494 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85090.log 2026-03-24T17:34:58.494 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.86159.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71010.log 2026-03-24T17:34:58.494 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86159.log.gz 2026-03-24T17:34:58.494 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25719.log 2026-03-24T17:34:58.495 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.85090.log: 0.0%/var/log/ceph/ceph-client.admin.71010.log: -- replaced with /var/log/ceph/ceph-client.admin.85090.log.gz 2026-03-24T17:34:58.495 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71010.log.gz 2026-03-24T17:34:58.495 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27445.log 2026-03-24T17:34:58.495 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.25719.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32467.log 2026-03-24T17:34:58.495 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25719.log.gz 2026-03-24T17:34:58.495 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.27445.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27445.log.gz 
2026-03-24T17:34:58.495 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85609.log 2026-03-24T17:34:58.495 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.32467.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36201.log 2026-03-24T17:34:58.496 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.85609.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59882.log 2026-03-24T17:34:58.496 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85609.log.gz 2026-03-24T17:34:58.496 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.32467.log.gz 2026-03-24T17:34:58.496 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.36201.log: gzip 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36201.log.gz 2026-03-24T17:34:58.496 INFO:teuthology.orchestra.run.vm01.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.85759.log 2026-03-24T17:34:58.496 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63608.log 2026-03-24T17:34:58.496 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.59882.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59882.log.gz 2026-03-24T17:34:58.496 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79197.log 2026-03-24T17:34:58.497 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.85759.log: /var/log/ceph/ceph-client.admin.63608.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63608.log.gz 2026-03-24T17:34:58.497 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85759.log.gzgzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75124.log 2026-03-24T17:34:58.497 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:58.497 
INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose --/var/log/ceph/ceph-client.admin.79197.log: /var/log/ceph/ceph-client.admin.38310.log 2026-03-24T17:34:58.497 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79197.log.gz 2026-03-24T17:34:58.497 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.75124.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75124.log.gz 2026-03-24T17:34:58.497 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44265.log 2026-03-24T17:34:58.497 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.38310.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70580.log 2026-03-24T17:34:58.498 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.44265.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84132.log 2026-03-24T17:34:58.498 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44265.log.gz 2026-03-24T17:34:58.498 INFO:teuthology.orchestra.run.vm01.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.38310.log.gz/var/log/ceph/ceph-client.admin.70580.log: 2026-03-24T17:34:58.498 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 0.0% /var/log/ceph/ceph-client.admin.43939.log 2026-03-24T17:34:58.498 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.70580.log.gz 2026-03-24T17:34:58.498 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69657.log 2026-03-24T17:34:58.498 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.84132.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84132.log.gz 2026-03-24T17:34:58.499 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.43939.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64691.log 
2026-03-24T17:34:58.499 INFO:teuthology.orchestra.run.vm01.stderr: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.43939.log.gz 2026-03-24T17:34:58.499 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.69657.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69657.log.gz 2026-03-24T17:34:58.499 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42629.log 2026-03-24T17:34:58.499 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.64691.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61381.log 2026-03-24T17:34:58.499 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64691.log.gz 2026-03-24T17:34:58.500 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.42629.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42629.log.gz 2026-03-24T17:34:58.500 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27782.log 2026-03-24T17:34:58.500 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81016.log 2026-03-24T17:34:58.500 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.61381.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61381.log.gz 2026-03-24T17:34:58.500 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81383.log 2026-03-24T17:34:58.500 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.27782.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27782.log.gz 2026-03-24T17:34:58.500 INFO:teuthology.orchestra.run.vm01.stderr:gzip/var/log/ceph/ceph-client.admin.81016.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.76543.log 2026-03-24T17:34:58.500 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81016.log.gz 2026-03-24T17:34:58.501 
INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.81383.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81383.log.gz 2026-03-24T17:34:58.501 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90557.log 2026-03-24T17:34:58.501 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48783.log 2026-03-24T17:34:58.501 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.76543.log: /var/log/ceph/ceph-client.admin.90557.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76543.log.gz 2026-03-24T17:34:58.501 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90557.log.gz 2026-03-24T17:34:58.501 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74144.log 2026-03-24T17:34:58.502 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.48783.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85501.log 2026-03-24T17:34:58.502 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48783.log.gz 2026-03-24T17:34:58.502 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.74144.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74144.log.gz 2026-03-24T17:34:58.502 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79584.log 2026-03-24T17:34:58.502 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.85501.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64358.log 2026-03-24T17:34:58.502 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85501.log.gz 2026-03-24T17:34:58.503 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.79584.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89619.log 
2026-03-24T17:34:58.503 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79584.log.gz 2026-03-24T17:34:58.503 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.64358.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64358.log.gz 2026-03-24T17:34:58.503 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28589.log 2026-03-24T17:34:58.503 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.89619.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72958.log 2026-03-24T17:34:58.503 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89619.log.gz 2026-03-24T17:34:58.503 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.28589.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29476.log 2026-03-24T17:34:58.504 INFO:teuthology.orchestra.run.vm01.stderr: 3.2% -- replaced with /var/log/ceph/ceph-client.admin.28589.log.gz 2026-03-24T17:34:58.504 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.72958.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72958.log.gz 2026-03-24T17:34:58.504 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49707.log 2026-03-24T17:34:58.504 INFO:teuthology.orchestra.run.vm01.stderr:gzip/var/log/ceph/ceph-client.admin.29476.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.54359.log 2026-03-24T17:34:58.504 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29476.log.gz 2026-03-24T17:34:58.504 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.49707.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49707.log.gz 2026-03-24T17:34:58.504 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41407.log 2026-03-24T17:34:58.504 
INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84488.log/var/log/ceph/ceph-client.admin.54359.log: 2026-03-24T17:34:58.504 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54359.log.gz 2026-03-24T17:34:58.505 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.41407.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41407.log.gz 2026-03-24T17:34:58.505 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80081.log 2026-03-24T17:34:58.505 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.84488.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44758.log 2026-03-24T17:34:58.505 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.80081.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47371.log 2026-03-24T17:34:58.505 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80081.log.gz 2026-03-24T17:34:58.505 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84488.log.gz 2026-03-24T17:34:58.505 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.44758.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44758.log.gz 2026-03-24T17:34:58.505 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32160.log 2026-03-24T17:34:58.506 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73627.log 2026-03-24T17:34:58.506 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.47371.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47371.log.gz 2026-03-24T17:34:58.506 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.32160.log: /var/log/ceph/ceph-client.admin.73627.log: 1.2% -- replaced with 
/var/log/ceph/ceph-client.admin.32160.log.gz 2026-03-24T17:34:58.506 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81155.log 2026-03-24T17:34:58.506 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73627.log.gz 2026-03-24T17:34:58.507 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57951.log 2026-03-24T17:34:58.507 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.81155.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67319.log 2026-03-24T17:34:58.507 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81155.log.gz 2026-03-24T17:34:58.507 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.57951.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55043.log 2026-03-24T17:34:58.507 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57951.log.gz 2026-03-24T17:34:58.507 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.67319.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75750.log 2026-03-24T17:34:58.507 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67319.log.gz 2026-03-24T17:34:58.508 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.55043.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43625.log 2026-03-24T17:34:58.508 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55043.log.gz 2026-03-24T17:34:58.508 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64844.log 2026-03-24T17:34:58.508 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.75750.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75750.log.gz 2026-03-24T17:34:58.508 
INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.43625.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47048.log 2026-03-24T17:34:58.508 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.64844.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.43625.log.gz 2026-03-24T17:34:58.508 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64844.log.gz 2026-03-24T17:34:58.508 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32740.log 2026-03-24T17:34:58.509 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.47048.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47048.log.gz 2026-03-24T17:34:58.509 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82644.log 2026-03-24T17:34:58.509 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32927.log 2026-03-24T17:34:58.509 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.32740.log: /var/log/ceph/ceph-client.admin.82644.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60800.log 2026-03-24T17:34:58.509 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82644.log.gz 2026-03-24T17:34:58.509 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32740.log.gz 2026-03-24T17:34:58.509 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69065.log 2026-03-24T17:34:58.510 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.32927.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80318.log 2026-03-24T17:34:58.510 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.60800.log: /var/log/ceph/ceph-client.admin.69065.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.60800.log.gz 2026-03-24T17:34:58.510 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32927.log.gz 2026-03-24T17:34:58.510 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69065.log.gz 2026-03-24T17:34:58.510 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65716.log 2026-03-24T17:34:58.510 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.80318.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81598.log 2026-03-24T17:34:58.510 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80318.log.gz 2026-03-24T17:34:58.510 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.65716.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76180.log 2026-03-24T17:34:58.510 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65716.log.gz 2026-03-24T17:34:58.510 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.81598.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30855.log 2026-03-24T17:34:58.510 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81598.log.gz 2026-03-24T17:34:58.511 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.76180.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57279.log 2026-03-24T17:34:58.511 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76180.log.gz 2026-03-24T17:34:58.511 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.30855.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73080.log 2026-03-24T17:34:58.511 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30855.log.gz 2026-03-24T17:34:58.511 
INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.57279.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57279.log.gz 2026-03-24T17:34:58.511 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72273.log 2026-03-24T17:34:58.511 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.73080.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65501.log 2026-03-24T17:34:58.512 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73080.log.gz 2026-03-24T17:34:58.512 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.72273.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64008.log 2026-03-24T17:34:58.512 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72273.log.gz 2026-03-24T17:34:58.512 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.65501.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88996.log 2026-03-24T17:34:58.512 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65501.log.gz 2026-03-24T17:34:58.512 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44386.log 2026-03-24T17:34:58.512 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.64008.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64008.log.gz 2026-03-24T17:34:58.512 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.88996.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88996.log.gz 2026-03-24T17:34:58.512 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71661.log 2026-03-24T17:34:58.512 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.44386.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70902.log 
2026-03-24T17:34:58.513 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44386.log.gz 2026-03-24T17:34:58.513 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.71661.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33577.log 2026-03-24T17:34:58.513 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71661.log.gz 2026-03-24T17:34:58.513 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.70902.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86902.log 2026-03-24T17:34:58.513 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70902.log.gz 2026-03-24T17:34:58.513 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66455.log 2026-03-24T17:34:58.513 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.33577.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33577.log.gz 2026-03-24T17:34:58.513 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.86902.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81835.log 2026-03-24T17:34:58.513 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86902.log.gz 2026-03-24T17:34:58.514 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.66455.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78757.log 2026-03-24T17:34:58.514 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66455.log.gz 2026-03-24T17:34:58.514 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32637.log 2026-03-24T17:34:58.514 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.81835.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81835.log.gz 2026-03-24T17:34:58.514 
INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.78757.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78757.log.gz 2026-03-24T17:34:58.514 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43881.log 2026-03-24T17:34:58.514 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.32637.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46876.log 2026-03-24T17:34:58.515 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.43881.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36166.log 2026-03-24T17:34:58.515 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43881.log.gz 2026-03-24T17:34:58.515 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32637.log.gz 2026-03-24T17:34:58.515 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.46876.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43044.log 2026-03-24T17:34:58.515 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46876.log.gz 2026-03-24T17:34:58.515 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27020.log 2026-03-24T17:34:58.515 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.36166.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36166.log.gz 2026-03-24T17:34:58.515 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78704.log 2026-03-24T17:34:58.515 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.43044.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43044.log.gz 2026-03-24T17:34:58.515 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.27020.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26459.log 
2026-03-24T17:34:58.515 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27020.log.gz 2026-03-24T17:34:58.515 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.78704.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30379.log 2026-03-24T17:34:58.516 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78704.log.gz 2026-03-24T17:34:58.516 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49836.log 2026-03-24T17:34:58.516 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.26459.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26459.log.gz 2026-03-24T17:34:58.516 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26115.log 2026-03-24T17:34:58.516 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.30379.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30379.log.gz /var/log/ceph/ceph-client.admin.49836.log: 2026-03-24T17:34:58.516 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75403.log 2026-03-24T17:34:58.516 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49836.log.gz 2026-03-24T17:34:58.516 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.26115.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76581.log 2026-03-24T17:34:58.516 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26115.log.gz 2026-03-24T17:34:58.517 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75449.log 2026-03-24T17:34:58.517 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.75403.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75403.log.gz 2026-03-24T17:34:58.517 
INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47328.log 2026-03-24T17:34:58.517 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.76581.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76581.log.gz 2026-03-24T17:34:58.517 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.75449.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86923.log 2026-03-24T17:34:58.517 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75449.log.gz 2026-03-24T17:34:58.517 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.47328.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32314.log 2026-03-24T17:34:58.517 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47328.log.gz 2026-03-24T17:34:58.517 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38625.log 2026-03-24T17:34:58.518 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.86923.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86923.log.gz 2026-03-24T17:34:58.518 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90115.log 2026-03-24T17:34:58.518 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.32314.log: /var/log/ceph/ceph-client.admin.38625.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34497.log 2026-03-24T17:34:58.518 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32314.log.gz 2026-03-24T17:34:58.518 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.90115.log: 26.2% -- replaced with /var/log/ceph/ceph-client.admin.38625.log.gz 2026-03-24T17:34:58.518 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90115.log.gz 
2026-03-24T17:34:58.518 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66210.log 2026-03-24T17:34:58.519 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.34497.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34497.log.gz 2026-03-24T17:34:58.519 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75793.log 2026-03-24T17:34:58.519 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.66210.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44186.log 2026-03-24T17:34:58.519 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66210.log.gz 2026-03-24T17:34:58.519 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.75793.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50469.log 2026-03-24T17:34:58.519 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75793.log.gz 2026-03-24T17:34:58.519 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.44186.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28651.log 2026-03-24T17:34:58.519 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44186.log.gz 2026-03-24T17:34:58.519 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.50469.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87138.log 2026-03-24T17:34:58.520 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50469.log.gz 2026-03-24T17:34:58.520 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.28651.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80875.log 2026-03-24T17:34:58.520 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28651.log.gz 2026-03-24T17:34:58.520 
INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.87138.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80490.log 2026-03-24T17:34:58.520 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87138.log.gz 2026-03-24T17:34:58.520 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38982.log 2026-03-24T17:34:58.520 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.80875.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80875.log.gz 2026-03-24T17:34:58.520 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90713.log 2026-03-24T17:34:58.520 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.80490.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80490.log.gz 2026-03-24T17:34:58.520 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.38982.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54261.log 2026-03-24T17:34:58.521 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.90713.log: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.38982.log.gz 2026-03-24T17:34:58.521 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90713.log.gz 2026-03-24T17:34:58.521 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64864.log 2026-03-24T17:34:58.521 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.54261.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28918.log 2026-03-24T17:34:58.521 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.64864.log: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.54261.log.gz 2026-03-24T17:34:58.521 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59118.log 
2026-03-24T17:34:58.521 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64864.log.gz 2026-03-24T17:34:58.522 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.28918.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84725.log 2026-03-24T17:34:58.522 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28918.log.gz 2026-03-24T17:34:58.522 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35724.log 2026-03-24T17:34:58.522 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.59118.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59118.log.gz 2026-03-24T17:34:58.522 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.84725.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84725.log.gz 2026-03-24T17:34:58.522 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79606.log 2026-03-24T17:34:58.522 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51285.log 2026-03-24T17:34:58.522 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.35724.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35724.log.gz 2026-03-24T17:34:58.523 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.79606.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90789.log 2026-03-24T17:34:58.523 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79606.log.gz 2026-03-24T17:34:58.523 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.51285.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85695.log 2026-03-24T17:34:58.523 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51285.log.gz 2026-03-24T17:34:58.523 
INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61756.log 2026-03-24T17:34:58.523 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.90789.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90789.log.gz 2026-03-24T17:34:58.523 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38667.log 2026-03-24T17:34:58.523 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.85695.log: 0.0% /var/log/ceph/ceph-client.admin.61756.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68905.log 2026-03-24T17:34:58.523 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.85695.log.gz 2026-03-24T17:34:58.523 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61756.log.gz 2026-03-24T17:34:58.524 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.38667.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58298.log 2026-03-24T17:34:58.524 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88695.log 2026-03-24T17:34:58.524 INFO:teuthology.orchestra.run.vm01.stderr: 26.6% -- replaced with /var/log/ceph/ceph-client.admin.38667.log.gz 2026-03-24T17:34:58.524 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.68905.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68905.log.gz 2026-03-24T17:34:58.524 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48954.log 2026-03-24T17:34:58.524 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.58298.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58298.log.gz /var/log/ceph/ceph-client.admin.88695.log: 2026-03-24T17:34:58.524 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65823.log 
2026-03-24T17:34:58.524 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88695.log.gz 2026-03-24T17:34:58.524 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.48954.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85781.log 2026-03-24T17:34:58.524 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48954.log.gz 2026-03-24T17:34:58.525 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57499.log 2026-03-24T17:34:58.525 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.65823.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65823.log.gz 2026-03-24T17:34:58.525 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47668.log 2026-03-24T17:34:58.525 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.85781.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85781.log.gz /var/log/ceph/ceph-client.admin.57499.log: 2026-03-24T17:34:58.525 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41986.log 2026-03-24T17:34:58.525 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57499.log.gz 2026-03-24T17:34:58.525 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.47668.log: gzip -5 --verbose -- /var/log/ceph/ceph-mgr.x.log 2026-03-24T17:34:58.525 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47668.log.gz 2026-03-24T17:34:58.525 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83031.log 2026-03-24T17:34:58.526 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.41986.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68410.log 2026-03-24T17:34:58.526 
INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-mgr.x.log: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.41986.log.gz 2026-03-24T17:34:58.526 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.83031.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39599.log 2026-03-24T17:34:58.526 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83031.log.gz 2026-03-24T17:34:58.529 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.68410.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68410.log.gz 2026-03-24T17:34:58.529 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66576.log 2026-03-24T17:34:58.530 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.39599.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39599.log.gz 2026-03-24T17:34:58.530 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80382.log 2026-03-24T17:34:58.530 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.66576.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64181.log 2026-03-24T17:34:58.530 INFO:teuthology.orchestra.run.vm01.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.66576.log.gz 2026-03-24T17:34:58.530 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56257.log 2026-03-24T17:34:58.530 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.64181.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32229.log 2026-03-24T17:34:58.530 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64181.log.gz 2026-03-24T17:34:58.531 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.80382.log: /var/log/ceph/ceph-client.admin.56257.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.80382.log.gz 2026-03-24T17:34:58.531 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56257.log.gz 2026-03-24T17:34:58.531 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26287.log 2026-03-24T17:34:58.531 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90271.log 2026-03-24T17:34:58.531 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.32229.log: /var/log/ceph/ceph-client.admin.26287.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26287.log.gz 2026-03-24T17:34:58.531 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43584.log 2026-03-24T17:34:58.531 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.90271.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90271.log.gz 2026-03-24T17:34:58.532 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32229.log.gz 2026-03-24T17:34:58.532 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89318.log 2026-03-24T17:34:58.532 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47392.log 2026-03-24T17:34:58.532 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.43584.log: /var/log/ceph/ceph-client.admin.89318.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29241.log 2026-03-24T17:34:58.532 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89318.log.gz 2026-03-24T17:34:58.532 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43584.log.gz 2026-03-24T17:34:58.532 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.47392.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.47392.log.gz 2026-03-24T17:34:58.533 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31156.log 2026-03-24T17:34:58.533 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32280.log 2026-03-24T17:34:58.533 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.29241.log: /var/log/ceph/ceph-client.admin.31156.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43303.log 2026-03-24T17:34:58.533 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31156.log.gz 2026-03-24T17:34:58.533 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29241.log.gz 2026-03-24T17:34:58.533 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.32280.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77978.log 2026-03-24T17:34:58.533 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32280.log.gz 2026-03-24T17:34:58.534 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.43303.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63121.log 2026-03-24T17:34:58.534 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43303.log.gz 2026-03-24T17:34:58.534 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86262.log 2026-03-24T17:34:58.534 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.77978.log: /var/log/ceph/ceph-client.admin.63121.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26904.log 2026-03-24T17:34:58.534 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77978.log.gz 2026-03-24T17:34:58.534 INFO:teuthology.orchestra.run.vm01.stderr: 57.9% -- replaced with 
/var/log/ceph/ceph-client.admin.63121.log.gz 2026-03-24T17:34:58.534 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.86262.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86262.log.gz 2026-03-24T17:34:58.535 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25797.log 2026-03-24T17:34:58.535 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53438.log 2026-03-24T17:34:58.535 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.26904.log: /var/log/ceph/ceph-client.admin.25797.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35962.log 2026-03-24T17:34:58.535 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25797.log.gz 2026-03-24T17:34:58.535 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.26904.log.gz 2026-03-24T17:34:58.535 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.53438.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53438.log.gz 2026-03-24T17:34:58.535 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36882.log 2026-03-24T17:34:58.536 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82623.log 2026-03-24T17:34:58.536 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.35962.log: /var/log/ceph/ceph-client.admin.36882.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48510.log 2026-03-24T17:34:58.536 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36882.log.gz 2026-03-24T17:34:58.536 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35962.log.gz 2026-03-24T17:34:58.536 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.82623.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.82623.log.gz 2026-03-24T17:34:58.536 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88716.log 2026-03-24T17:34:58.536 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52709.log 2026-03-24T17:34:58.537 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.48510.log: /var/log/ceph/ceph-client.admin.88716.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77612.log 2026-03-24T17:34:58.537 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88716.log.gz 2026-03-24T17:34:58.537 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48510.log.gz 2026-03-24T17:34:58.537 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.52709.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52709.log.gz 2026-03-24T17:34:58.537 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50962.log 2026-03-24T17:34:58.537 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48243.log 2026-03-24T17:34:58.538 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.77612.log: /var/log/ceph/ceph-client.admin.50962.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65458.log 2026-03-24T17:34:58.538 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50962.log.gz 2026-03-24T17:34:58.538 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77612.log.gz 2026-03-24T17:34:58.538 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.48243.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48243.log.gz 2026-03-24T17:34:58.538 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.52752.log 2026-03-24T17:34:58.538 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48632.log 2026-03-24T17:34:58.538 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.65458.log: /var/log/ceph/ceph-client.admin.52752.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59901.log 2026-03-24T17:34:58.538 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52752.log.gz 2026-03-24T17:34:58.539 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65458.log.gz 2026-03-24T17:34:58.539 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.48632.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48632.log.gz 2026-03-24T17:34:58.539 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68114.log 2026-03-24T17:34:58.539 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54785.log 2026-03-24T17:34:58.539 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.59901.log: /var/log/ceph/ceph-client.admin.68114.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60908.log 2026-03-24T17:34:58.539 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68114.log.gz 2026-03-24T17:34:58.539 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59901.log.gz/var/log/ceph/ceph-client.admin.54785.log: 2026-03-24T17:34:58.539 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54785.log.gz 2026-03-24T17:34:58.540 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57362.log 2026-03-24T17:34:58.540 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.40913.log 2026-03-24T17:34:58.540 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.60908.log: /var/log/ceph/ceph-client.admin.57362.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28875.log 2026-03-24T17:34:58.540 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57362.log.gz 2026-03-24T17:34:58.540 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60908.log.gz 2026-03-24T17:34:58.540 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.40913.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40913.log.gz 2026-03-24T17:34:58.541 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41907.log 2026-03-24T17:34:58.541 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39943.log 2026-03-24T17:34:58.541 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.28875.log: /var/log/ceph/ceph-client.admin.41907.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72659.log 2026-03-24T17:34:58.541 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28875.log.gz 2026-03-24T17:34:58.541 INFO:teuthology.orchestra.run.vm01.stderr: 26.5%/var/log/ceph/ceph-client.admin.39943.log: -- replaced with /var/log/ceph/ceph-client.admin.41907.log.gz 2026-03-24T17:34:58.541 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39943.log.gz 2026-03-24T17:34:58.541 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58819.log 2026-03-24T17:34:58.542 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.72659.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59532.log 2026-03-24T17:34:58.542 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.72659.log.gz 2026-03-24T17:34:58.542 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.58819.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44909.log 2026-03-24T17:34:58.542 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58819.log.gz 2026-03-24T17:34:58.542 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66910.log 2026-03-24T17:34:58.542 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.59532.log: /var/log/ceph/ceph-client.admin.44909.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44909.log.gz 2026-03-24T17:34:58.543 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71090.log 2026-03-24T17:34:58.543 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.59532.log.gz 2026-03-24T17:34:58.543 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.66910.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70494.log 2026-03-24T17:34:58.543 INFO:teuthology.orchestra.run.vm01.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.66910.log.gz 2026-03-24T17:34:58.543 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.71090.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71090.log.gz 2026-03-24T17:34:58.543 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57439.log 2026-03-24T17:34:58.543 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.70494.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69829.log 2026-03-24T17:34:58.543 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70494.log.gz 2026-03-24T17:34:58.544 
INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.57439.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57439.log.gz 2026-03-24T17:34:58.544 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46855.log 2026-03-24T17:34:58.544 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39427.log 2026-03-24T17:34:58.544 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.69829.log: /var/log/ceph/ceph-client.admin.46855.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46855.log.gz 2026-03-24T17:34:58.544 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.69829.log.gzgzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53247.log 2026-03-24T17:34:58.544 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:58.544 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49341.log 2026-03-24T17:34:58.544 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.39427.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39427.log.gz 2026-03-24T17:34:58.544 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.53247.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53247.log.gz 2026-03-24T17:34:58.545 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71754.log 2026-03-24T17:34:58.545 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66103.log 2026-03-24T17:34:58.545 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.49341.log: /var/log/ceph/ceph-client.admin.71754.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71754.log.gz 2026-03-24T17:34:58.545 INFO:teuthology.orchestra.run.vm01.stderr: 0.0%gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60886.log 
2026-03-24T17:34:58.546 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.49341.log.gzgzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51796.log 2026-03-24T17:34:58.546 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:58.546 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.66103.log: /var/log/ceph/ceph-client.admin.60886.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60886.log.gz 2026-03-24T17:34:58.546 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66103.log.gz 2026-03-24T17:34:58.546 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64750.log 2026-03-24T17:34:58.546 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38583.log 2026-03-24T17:34:58.546 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.51796.log: /var/log/ceph/ceph-client.admin.64750.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64750.log.gz 2026-03-24T17:34:58.546 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50704.log 2026-03-24T17:34:58.546 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51796.log.gz 2026-03-24T17:34:58.546 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.38583.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80017.log 2026-03-24T17:34:58.547 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.50704.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28983.log 2026-03-24T17:34:58.547 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50704.log.gz 2026-03-24T17:34:58.547 INFO:teuthology.orchestra.run.vm01.stderr: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.38583.log.gz 2026-03-24T17:34:58.547 
INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72595.log 2026-03-24T17:34:58.547 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.80017.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80017.log.gz 2026-03-24T17:34:58.547 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51263.log 2026-03-24T17:34:58.547 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.72595.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37231.log 2026-03-24T17:34:58.547 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.28983.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72595.log.gz 2026-03-24T17:34:58.547 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.51263.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28983.log.gz 2026-03-24T17:34:58.547 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51263.log.gz 2026-03-24T17:34:58.548 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50575.log 2026-03-24T17:34:58.548 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.37231.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73377.log 2026-03-24T17:34:58.548 INFO:teuthology.orchestra.run.vm01.stderr: 26.2% -- replaced with /var/log/ceph/ceph-client.admin.37231.log.gz 2026-03-24T17:34:58.548 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.50575.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50575.log.gz 2026-03-24T17:34:58.548 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39706.log 2026-03-24T17:34:58.548 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.73377.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71732.log 
2026-03-24T17:34:58.548 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73377.log.gz 2026-03-24T17:34:58.549 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.39706.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39706.log.gz 2026-03-24T17:34:58.549 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28106.log 2026-03-24T17:34:58.549 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31752.log 2026-03-24T17:34:58.549 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.71732.log: /var/log/ceph/ceph-client.admin.28106.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50180.log 2026-03-24T17:34:58.549 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28106.log.gz 2026-03-24T17:34:58.549 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71732.log.gz 2026-03-24T17:34:58.549 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.31752.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.31752.log.gz 2026-03-24T17:34:58.550 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81534.log 2026-03-24T17:34:58.550 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.50180.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73357.log 2026-03-24T17:34:58.550 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50180.log.gz 2026-03-24T17:34:58.550 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.81534.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59178.log 2026-03-24T17:34:58.550 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81534.log.gz 2026-03-24T17:34:58.550 
INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31684.log 2026-03-24T17:34:58.551 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.59178.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59178.log.gz 2026-03-24T17:34:58.551 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63081.log 2026-03-24T17:34:58.551 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.73357.log: /var/log/ceph/ceph-client.admin.31684.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88888.log 2026-03-24T17:34:58.551 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.63081.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.31684.log.gz 2026-03-24T17:34:58.551 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63081.log.gz 2026-03-24T17:34:58.551 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73357.log.gz 2026-03-24T17:34:58.551 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82049.log 2026-03-24T17:34:58.551 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.88888.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88888.log.gz 2026-03-24T17:34:58.551 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77909.log 2026-03-24T17:34:58.552 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.82049.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82049.log.gz 2026-03-24T17:34:58.552 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80554.log 2026-03-24T17:34:58.552 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.77909.log: gzip 0.0% -5 -- replaced with /var/log/ceph/ceph-client.admin.77909.log.gz 
--verbose 2026-03-24T17:34:58.552 INFO:teuthology.orchestra.run.vm01.stderr: -- /var/log/ceph/ceph-client.admin.60112.log 2026-03-24T17:34:58.552 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.80554.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80554.log.gz 2026-03-24T17:34:58.552 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52817.log 2026-03-24T17:34:58.552 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.60112.log: gzip -5 --verbose 0.0% -- -- replaced with /var/log/ceph/ceph-client.admin.60112.log.gz /var/log/ceph/ceph-client.admin.35979.log 2026-03-24T17:34:58.552 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:58.553 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.52817.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52817.log.gz 2026-03-24T17:34:58.553 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41825.log 2026-03-24T17:34:58.553 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31264.log 2026-03-24T17:34:58.553 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.41825.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87377.log 2026-03-24T17:34:58.553 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.31264.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41825.log.gz -- replaced with /var/log/ceph/ceph-client.admin.31264.log.gz 2026-03-24T17:34:58.553 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:58.554 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.35979.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35979.log.gz 2026-03-24T17:34:58.554 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55947.log 2026-03-24T17:34:58.554 
INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38730.log 2026-03-24T17:34:58.554 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.55947.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48388.log 2026-03-24T17:34:58.554 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55947.log.gz 2026-03-24T17:34:58.555 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.87377.log: /var/log/ceph/ceph-client.admin.38730.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87377.log.gz 2026-03-24T17:34:58.555 INFO:teuthology.orchestra.run.vm01.stderr: 25.5% -- replaced with /var/log/ceph/ceph-client.admin.38730.log.gz 2026-03-24T17:34:58.555 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70129.log 2026-03-24T17:34:58.555 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57706.log 2026-03-24T17:34:58.556 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.70129.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28749.log 2026-03-24T17:34:58.556 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70129.log.gz 2026-03-24T17:34:58.556 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.48388.log: /var/log/ceph/ceph-client.admin.57706.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57706.log.gz 2026-03-24T17:34:58.556 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48388.log.gz 2026-03-24T17:34:58.556 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60865.log 2026-03-24T17:34:58.556 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43176.log 2026-03-24T17:34:58.557 
INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.60865.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27481.log 2026-03-24T17:34:58.557 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.28749.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60865.log.gz 2026-03-24T17:34:58.557 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.43176.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28749.log.gz 2026-03-24T17:34:58.557 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32024.log 2026-03-24T17:34:58.557 INFO:teuthology.orchestra.run.vm01.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.43176.log.gz 2026-03-24T17:34:58.557 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.27481.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27481.log.gz 2026-03-24T17:34:58.557 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70816.log 2026-03-24T17:34:58.558 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.32024.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85888.log 2026-03-24T17:34:58.558 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32024.log.gz 2026-03-24T17:34:58.558 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.70816.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70816.log.gz 2026-03-24T17:34:58.558 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49535.log 2026-03-24T17:34:58.558 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.85888.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85888.log.gz 2026-03-24T17:34:58.558 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30315.log 
2026-03-24T17:34:58.559 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.49535.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49535.log.gz 2026-03-24T17:34:58.559 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84149.log 2026-03-24T17:34:58.559 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.30315.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30315.log.gz 2026-03-24T17:34:58.559 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69270.log 2026-03-24T17:34:58.559 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.84149.log: gzip -5 0.0% --verbose -- replaced with /var/log/ceph/ceph-client.admin.84149.log.gz -- 2026-03-24T17:34:58.559 INFO:teuthology.orchestra.run.vm01.stderr: /var/log/ceph/ceph-client.admin.65995.log 2026-03-24T17:34:58.559 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.69270.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69270.log.gz 2026-03-24T17:34:58.559 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33354.log 2026-03-24T17:34:58.559 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.65995.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54485.log 2026-03-24T17:34:58.560 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65995.log.gz 2026-03-24T17:34:58.560 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.33354.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33354.log.gz 2026-03-24T17:34:58.560 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45793.log 2026-03-24T17:34:58.560 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84983.log 2026-03-24T17:34:58.560 
INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.45793.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64634.log 2026-03-24T17:34:58.560 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45793.log.gz 2026-03-24T17:34:58.561 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.54485.log: /var/log/ceph/ceph-client.admin.84983.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84983.log.gz 2026-03-24T17:34:58.561 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54485.log.gz 2026-03-24T17:34:58.561 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45016.log 2026-03-24T17:34:58.561 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52474.log 2026-03-24T17:34:58.562 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.64634.log: /var/log/ceph/ceph-client.admin.45016.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62998.log 2026-03-24T17:34:58.562 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45016.log.gz 2026-03-24T17:34:58.562 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.52474.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64634.log.gz 2026-03-24T17:34:58.562 INFO:teuthology.orchestra.run.vm01.stderr: 54.8% -- replaced with /var/log/ceph/ceph-client.admin.52474.log.gz 2026-03-24T17:34:58.562 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65104.log 2026-03-24T17:34:58.562 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28552.log 2026-03-24T17:34:58.563 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.65104.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77791.log 
2026-03-24T17:34:58.563 INFO:teuthology.orchestra.run.vm01.stderr: 52.9% -- replaced with /var/log/ceph/ceph-client.admin.65104.log.gz 2026-03-24T17:34:58.563 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.62998.log: /var/log/ceph/ceph-client.admin.28552.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28552.log.gz 2026-03-24T17:34:58.563 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62998.log.gz 2026-03-24T17:34:58.563 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39405.log 2026-03-24T17:34:58.563 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32978.log 2026-03-24T17:34:58.564 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.39405.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39405.log.gz 2026-03-24T17:34:58.564 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35299.log 2026-03-24T17:34:58.564 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.77791.log: /var/log/ceph/ceph-client.admin.32978.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62724.log 2026-03-24T17:34:58.564 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.35299.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32978.log.gz 2026-03-24T17:34:58.564 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35299.log.gz 2026-03-24T17:34:58.564 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.77791.log.gz 2026-03-24T17:34:58.564 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48117.log 2026-03-24T17:34:58.565 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.62724.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.62724.log.gz
[2026-03-24T17:34:58.565–17:34:58.606 INFO:teuthology.orchestra.run.vm01.stderr: interleaved output of concurrent `gzip -5 --verbose` jobs compressing /var/log/ceph/ceph-client.admin.<pid>.log files; each log replaced with its .log.gz counterpart, compression ratios mostly 0.0%, up to 91.2%]
/var/log/ceph/ceph-client.admin.73397.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90445.log 2026-03-24T17:34:58.606 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73397.log.gz 2026-03-24T17:34:58.606 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66556.log.gz 2026-03-24T17:34:58.606 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.35400.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35400.log.gz 2026-03-24T17:34:58.606 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69894.log 2026-03-24T17:34:58.606 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79305.log 2026-03-24T17:34:58.607 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.90445.log: /var/log/ceph/ceph-client.admin.69894.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54012.log 2026-03-24T17:34:58.607 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69894.log.gz 2026-03-24T17:34:58.607 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90445.log.gz 2026-03-24T17:34:58.607 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.79305.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79305.log.gz 2026-03-24T17:34:58.607 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50661.log 2026-03-24T17:34:58.607 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42670.log 2026-03-24T17:34:58.607 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.54012.log: /var/log/ceph/ceph-client.admin.50661.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39921.log 2026-03-24T17:34:58.607 
INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50661.log.gz 2026-03-24T17:34:58.607 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54012.log.gz 2026-03-24T17:34:58.608 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.42670.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42670.log.gz 2026-03-24T17:34:58.608 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53948.log 2026-03-24T17:34:58.608 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85996.log 2026-03-24T17:34:58.608 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.39921.log: /var/log/ceph/ceph-client.admin.53948.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42609.log 2026-03-24T17:34:58.608 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53948.log.gz 2026-03-24T17:34:58.608 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39921.log.gz 2026-03-24T17:34:58.608 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.85996.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85996.log.gz 2026-03-24T17:34:58.609 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58919.log 2026-03-24T17:34:58.609 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43683.log 2026-03-24T17:34:58.609 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.42609.log: /var/log/ceph/ceph-client.admin.58919.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67190.log 2026-03-24T17:34:58.609 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58919.log.gz 2026-03-24T17:34:58.609 
INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42609.log.gz 2026-03-24T17:34:58.609 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.43683.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43683.log.gz 2026-03-24T17:34:58.610 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61948.log 2026-03-24T17:34:58.610 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89598.log 2026-03-24T17:34:58.610 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.61948.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54763.log 2026-03-24T17:34:58.610 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.67190.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61948.log.gz 2026-03-24T17:34:58.610 INFO:teuthology.orchestra.run.vm01.stderr: 0.0%/var/log/ceph/ceph-client.admin.89598.log: -- replaced with /var/log/ceph/ceph-client.admin.67190.log.gz 2026-03-24T17:34:58.610 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89598.log.gz 2026-03-24T17:34:58.610 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55323.log 2026-03-24T17:34:58.610 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.54763.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75965.log 2026-03-24T17:34:58.611 INFO:teuthology.orchestra.run.vm01.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.54763.log.gz 2026-03-24T17:34:58.611 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.55323.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55323.log.gz 2026-03-24T17:34:58.611 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32654.log 2026-03-24T17:34:58.611 
INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48975.log 2026-03-24T17:34:58.611 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.75965.log: /var/log/ceph/ceph-client.admin.32654.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32603.log 2026-03-24T17:34:58.611 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32654.log.gz 2026-03-24T17:34:58.611 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75965.log.gz 2026-03-24T17:34:58.611 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.48975.log: 56.1% -- replaced with /var/log/ceph/ceph-client.admin.48975.log.gz 2026-03-24T17:34:58.612 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46812.log 2026-03-24T17:34:58.612 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38121.log 2026-03-24T17:34:58.612 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.32603.log: /var/log/ceph/ceph-client.admin.46812.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68945.log 2026-03-24T17:34:58.612 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46812.log.gz 2026-03-24T17:34:58.612 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32603.log.gz 2026-03-24T17:34:58.612 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.38121.log: 26.8%gzip -- replaced with /var/log/ceph/ceph-client.admin.38121.log.gz 2026-03-24T17:34:58.612 INFO:teuthology.orchestra.run.vm01.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.51070.log 2026-03-24T17:34:58.613 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78842.log 2026-03-24T17:34:58.613 
INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.68945.log: /var/log/ceph/ceph-client.admin.51070.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69025.log 2026-03-24T17:34:58.613 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51070.log.gz 2026-03-24T17:34:58.613 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68945.log.gz 2026-03-24T17:34:58.613 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.78842.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78842.log.gz 2026-03-24T17:34:58.613 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42549.log 2026-03-24T17:34:58.614 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42735.log 2026-03-24T17:34:58.614 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.69025.log: /var/log/ceph/ceph-client.admin.42549.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84553.log 2026-03-24T17:34:58.614 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42549.log.gz 2026-03-24T17:34:58.614 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.42735.log: 12.0% -- replaced with /var/log/ceph/ceph-client.admin.69025.log.gz 2026-03-24T17:34:58.614 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42735.log.gz 2026-03-24T17:34:58.614 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37273.log 2026-03-24T17:34:58.614 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58105.log 2026-03-24T17:34:58.615 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.84553.log: /var/log/ceph/ceph-client.admin.37273.log: gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.35792.log 2026-03-24T17:34:58.615 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84553.log.gz 2026-03-24T17:34:58.615 INFO:teuthology.orchestra.run.vm01.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.37273.log.gz 2026-03-24T17:34:58.615 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.58105.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58105.log.gz 2026-03-24T17:34:58.615 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78021.log 2026-03-24T17:34:58.615 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.35792.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86429.log 2026-03-24T17:34:58.615 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35792.log.gz 2026-03-24T17:34:58.615 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.78021.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78021.log.gz 2026-03-24T17:34:58.616 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89447.log 2026-03-24T17:34:58.616 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29863.log 2026-03-24T17:34:58.616 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.86429.log: /var/log/ceph/ceph-client.admin.89447.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59778.log 2026-03-24T17:34:58.616 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89447.log.gz 2026-03-24T17:34:58.616 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86429.log.gz 2026-03-24T17:34:58.616 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.29863.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.29863.log.gz 2026-03-24T17:34:58.616 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82967.log 2026-03-24T17:34:58.617 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41149.log 2026-03-24T17:34:58.617 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.59778.log: /var/log/ceph/ceph-client.admin.82967.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76116.log 2026-03-24T17:34:58.617 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82967.log.gz 2026-03-24T17:34:58.617 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59778.log.gz 2026-03-24T17:34:58.617 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.41149.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41149.log.gz 2026-03-24T17:34:58.617 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73887.log 2026-03-24T17:34:58.618 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63528.log 2026-03-24T17:34:58.618 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.76116.log: /var/log/ceph/ceph-client.admin.73887.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89275.log 2026-03-24T17:34:58.618 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73887.log.gz 2026-03-24T17:34:58.618 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76116.log.gz 2026-03-24T17:34:58.618 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.63528.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63528.log.gz 2026-03-24T17:34:58.618 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.51156.log 2026-03-24T17:34:58.619 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76324.log 2026-03-24T17:34:58.619 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.89275.log: /var/log/ceph/ceph-client.admin.51156.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43761.log 2026-03-24T17:34:58.619 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51156.log.gz 2026-03-24T17:34:58.619 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89275.log.gz 2026-03-24T17:34:58.619 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.76324.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76324.log.gz 2026-03-24T17:34:58.619 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60263.log 2026-03-24T17:34:58.619 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33517.log 2026-03-24T17:34:58.620 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.43761.log: /var/log/ceph/ceph-client.admin.60263.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54978.log 2026-03-24T17:34:58.620 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60263.log.gz 2026-03-24T17:34:58.620 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.33517.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33517.log.gz 2026-03-24T17:34:58.620 INFO:teuthology.orchestra.run.vm01.stderr: 25.5% -- replaced with /var/log/ceph/ceph-client.admin.43761.log.gz 2026-03-24T17:34:58.620 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81770.log 2026-03-24T17:34:58.620 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.43803.log 2026-03-24T17:34:58.621 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.54978.log: /var/log/ceph/ceph-client.admin.81770.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44887.log 2026-03-24T17:34:58.621 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81770.log.gz 2026-03-24T17:34:58.621 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54978.log.gz 2026-03-24T17:34:58.621 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.43803.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43803.log.gz 2026-03-24T17:34:58.621 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25980.log 2026-03-24T17:34:58.621 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.44887.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44887.log.gz 2026-03-24T17:34:58.622 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29219.log 2026-03-24T17:34:58.622 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.25980.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63967.log 2026-03-24T17:34:58.622 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25980.log.gz 2026-03-24T17:34:58.622 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.29219.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29219.log.gz 2026-03-24T17:34:58.622 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62124.log 2026-03-24T17:34:58.622 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80597.log 2026-03-24T17:34:58.623 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.63967.log: 
/var/log/ceph/ceph-client.admin.62124.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58126.log 2026-03-24T17:34:58.623 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62124.log.gz 2026-03-24T17:34:58.623 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63967.log.gz 2026-03-24T17:34:58.623 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.80597.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80597.log.gz 2026-03-24T17:34:58.623 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52538.log 2026-03-24T17:34:58.623 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55689.log 2026-03-24T17:34:58.623 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.58126.log: /var/log/ceph/ceph-client.admin.52538.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55021.log 2026-03-24T17:34:58.623 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52538.log.gz 2026-03-24T17:34:58.624 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58126.log.gz 2026-03-24T17:34:58.624 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.55689.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55689.log.gz 2026-03-24T17:34:58.624 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38814.log 2026-03-24T17:34:58.624 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39108.log 2026-03-24T17:34:58.624 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.55021.log: /var/log/ceph/ceph-client.admin.38814.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46289.log 2026-03-24T17:34:58.624 
INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38814.log.gz 2026-03-24T17:34:58.624 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55021.log.gz/var/log/ceph/ceph-client.admin.39108.log: 2026-03-24T17:34:58.625 INFO:teuthology.orchestra.run.vm01.stderr: 25.3% -- replaced with /var/log/ceph/ceph-client.admin.39108.log.gz 2026-03-24T17:34:58.625 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27235.log 2026-03-24T17:34:58.625 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44995.log 2026-03-24T17:34:58.625 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.46289.log: /var/log/ceph/ceph-client.admin.27235.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.27235.log.gz 2026-03-24T17:34:58.625 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33081.log 2026-03-24T17:34:58.626 INFO:teuthology.orchestra.run.vm01.stderr: 0.0%/var/log/ceph/ceph-client.admin.44995.log: -- replaced with /var/log/ceph/ceph-client.admin.46289.log.gzgzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57537.log 0.0% 2026-03-24T17:34:58.626 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.44995.log.gz 2026-03-24T17:34:58.626 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:58.626 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.33081.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79563.log 2026-03-24T17:34:58.626 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.33081.log.gz 2026-03-24T17:34:58.626 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75836.log 2026-03-24T17:34:58.626 
INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.57537.log: /var/log/ceph/ceph-client.admin.79563.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32195.log 2026-03-24T17:34:58.626 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79563.log.gz 2026-03-24T17:34:58.626 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57537.log.gz/var/log/ceph/ceph-client.admin.75836.log: 2026-03-24T17:34:58.626 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75836.log.gz 2026-03-24T17:34:58.627 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75167.log 2026-03-24T17:34:58.627 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76491.log 2026-03-24T17:34:58.627 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.32195.log: /var/log/ceph/ceph-client.admin.75167.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88587.log 2026-03-24T17:34:58.627 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75167.log.gz 2026-03-24T17:34:58.627 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.76491.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76491.log.gz 2026-03-24T17:34:58.627 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32195.log.gz 2026-03-24T17:34:58.628 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36508.log 2026-03-24T17:34:58.628 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79412.log 2026-03-24T17:34:58.628 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.88587.log: /var/log/ceph/ceph-client.admin.36508.log: gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.62315.log 2026-03-24T17:34:58.628 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36508.log.gz 2026-03-24T17:34:58.628 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88587.log.gz 2026-03-24T17:34:58.628 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.79412.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79412.log.gz 2026-03-24T17:34:58.629 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64105.log 2026-03-24T17:34:58.629 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46051.log 2026-03-24T17:34:58.629 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.62315.log: /var/log/ceph/ceph-client.admin.64105.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38415.log 2026-03-24T17:34:58.629 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64105.log.gz 2026-03-24T17:34:58.629 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62315.log.gz 2026-03-24T17:34:58.629 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.46051.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46051.log.gz 2026-03-24T17:34:58.629 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67835.log 2026-03-24T17:34:58.630 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79042.log 2026-03-24T17:34:58.630 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.38415.log: /var/log/ceph/ceph-client.admin.67835.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74491.log 2026-03-24T17:34:58.630 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.67835.log.gz 2026-03-24T17:34:58.630 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.79042.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79042.log.gz 2026-03-24T17:34:58.630 INFO:teuthology.orchestra.run.vm01.stderr: 26.2% -- replaced with /var/log/ceph/ceph-client.admin.38415.log.gz 2026-03-24T17:34:58.630 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52666.log 2026-03-24T17:34:58.631 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35279.log 2026-03-24T17:34:58.631 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.74491.log: /var/log/ceph/ceph-client.admin.52666.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60478.log 2026-03-24T17:34:58.631 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52666.log.gz 2026-03-24T17:34:58.631 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74491.log.gz 2026-03-24T17:34:58.631 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.35279.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35279.log.gz 2026-03-24T17:34:58.631 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56790.log 2026-03-24T17:34:58.631 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73592.log 2026-03-24T17:34:58.632 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.60478.log: /var/log/ceph/ceph-client.admin.56790.log: gzip 0.0% -5 --verbose -- replaced with /var/log/ceph/ceph-client.admin.56790.log.gz -- 2026-03-24T17:34:58.632 INFO:teuthology.orchestra.run.vm01.stderr: /var/log/ceph/ceph-client.admin.26653.log 2026-03-24T17:34:58.632 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.60478.log.gz 2026-03-24T17:34:58.632 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.73592.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73592.log.gz 2026-03-24T17:34:58.632 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26631.log 2026-03-24T17:34:58.632 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37378.log 2026-03-24T17:34:58.633 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.26653.log: /var/log/ceph/ceph-client.admin.26631.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37777.log 2026-03-24T17:34:58.633 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26631.log.gz 2026-03-24T17:34:58.633 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26653.log.gz 2026-03-24T17:34:58.633 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.37378.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82795.log 2026-03-24T17:34:58.633 INFO:teuthology.orchestra.run.vm01.stderr: 25.9% -- replaced with /var/log/ceph/ceph-client.admin.37378.log.gz 2026-03-24T17:34:58.633 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77633.log 2026-03-24T17:34:58.634 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.82795.log: /var/log/ceph/ceph-client.admin.37777.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89813.log 2026-03-24T17:34:58.634 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82795.log.gz 2026-03-24T17:34:58.634 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.77633.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77633.log.gz 2026-03-24T17:34:58.634 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.53824.log 2026-03-24T17:34:58.634 INFO:teuthology.orchestra.run.vm01.stderr: 26.3% -- replaced with /var/log/ceph/ceph-client.admin.37777.log.gz 2026-03-24T17:34:58.634 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.89813.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37168.log 2026-03-24T17:34:58.634 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89813.log.gz 2026-03-24T17:34:58.634 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.53824.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53824.log.gz 2026-03-24T17:34:58.635 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54892.log 2026-03-24T17:34:58.635 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78295.log 2026-03-24T17:34:58.635 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.54892.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54892.log.gz 2026-03-24T17:34:58.635 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63180.log 2026-03-24T17:34:58.635 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.37168.log: /var/log/ceph/ceph-client.admin.78295.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78295.log.gz 2026-03-24T17:34:58.635 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45274.log 2026-03-24T17:34:58.635 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.63180.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57382.log 2026-03-24T17:34:58.636 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63180.log.gz 2026-03-24T17:34:58.636 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.45274.log: 
0.0% -- replaced with /var/log/ceph/ceph-client.admin.45274.log.gz 2026-03-24T17:34:58.636 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48472.log 2026-03-24T17:34:58.636 INFO:teuthology.orchestra.run.vm01.stderr: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.37168.log.gz 2026-03-24T17:34:58.636 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.57382.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57382.log.gz 2026-03-24T17:34:58.636 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37123.log 2026-03-24T17:34:58.636 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.48472.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48472.log.gz 2026-03-24T17:34:58.636 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85069.log 2026-03-24T17:34:58.636 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.37123.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90479.log 2026-03-24T17:34:58.637 INFO:teuthology.orchestra.run.vm01.stderr: 55.0% -- replaced with /var/log/ceph/ceph-client.admin.37123.log.gz 2026-03-24T17:34:58.637 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.85069.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85069.log.gz 2026-03-24T17:34:58.637 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82816.log 2026-03-24T17:34:58.637 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.90479.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90479.log.gz 2026-03-24T17:34:58.637 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38961.log 2026-03-24T17:34:58.637 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.82816.log: gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.31667.log 2026-03-24T17:34:58.637 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82816.log.gz 2026-03-24T17:34:58.638 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.38961.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40343.log 2026-03-24T17:34:58.638 INFO:teuthology.orchestra.run.vm01.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.38961.log.gz 2026-03-24T17:34:58.638 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68804.log 2026-03-24T17:34:58.638 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.40343.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71355.log 2026-03-24T17:34:58.638 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.31667.log: /var/log/ceph/ceph-client.admin.68804.log: 65.0% -- replaced with /var/log/ceph/ceph-client.admin.40343.log.gz 2026-03-24T17:34:58.638 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68804.log.gz 2026-03-24T17:34:58.639 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29562.log 2026-03-24T17:34:58.639 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.71355.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.31667.log.gzgzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74823.log 2026-03-24T17:34:58.639 INFO:teuthology.orchestra.run.vm01.stderr: 55.0% -- replaced with /var/log/ceph/ceph-client.admin.71355.log.gz 2026-03-24T17:34:58.639 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:58.639 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.29562.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29562.log.gz 2026-03-24T17:34:58.639 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.27577.log 2026-03-24T17:34:58.640 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55538.log 2026-03-24T17:34:58.640 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.27577.log: gzip -5 --verbose -- 0.0% /var/log/ceph/ceph-client.admin.66825.log -- replaced with /var/log/ceph/ceph-client.admin.27577.log.gz 2026-03-24T17:34:58.640 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:58.640 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.74823.log: /var/log/ceph/ceph-client.admin.55538.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55538.log.gz 2026-03-24T17:34:58.640 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74823.log.gz 2026-03-24T17:34:58.640 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35655.log 2026-03-24T17:34:58.641 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29026.log 2026-03-24T17:34:58.641 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.35655.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35655.log.gz 2026-03-24T17:34:58.641 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31457.log 2026-03-24T17:34:58.641 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.66825.log: /var/log/ceph/ceph-client.admin.29026.log: gzip 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29026.log.gz 2026-03-24T17:34:58.641 INFO:teuthology.orchestra.run.vm01.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.65065.log 2026-03-24T17:34:58.641 INFO:teuthology.orchestra.run.vm01.stderr: 0.0%/var/log/ceph/ceph-client.admin.31457.log: -- replaced with /var/log/ceph/ceph-client.admin.66825.log.gz 2026-03-24T17:34:58.641 INFO:teuthology.orchestra.run.vm01.stderr: 26.1% -- 
replaced with /var/log/ceph/ceph-client.admin.31457.log.gz 2026-03-24T17:34:58.642 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43782.log 2026-03-24T17:34:58.642 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60370.log 2026-03-24T17:34:58.642 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.43782.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48826.log 2026-03-24T17:34:58.642 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43782.log.gz 2026-03-24T17:34:58.642 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.65065.log: /var/log/ceph/ceph-client.admin.60370.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60370.log.gz 2026-03-24T17:34:58.643 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65065.log.gz 2026-03-24T17:34:58.643 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37609.log 2026-03-24T17:34:58.643 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90379.log 2026-03-24T17:34:58.643 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.37609.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35339.log 2026-03-24T17:34:58.644 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.48826.log: 25.9% -- replaced with /var/log/ceph/ceph-client.admin.37609.log.gz 2026-03-24T17:34:58.644 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.90379.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90379.log.gz 2026-03-24T17:34:58.644 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48826.log.gz 2026-03-24T17:34:58.644 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.29112.log 2026-03-24T17:34:58.644 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.35339.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35339.log.gz 2026-03-24T17:34:58.644 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41343.log 2026-03-24T17:34:58.644 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.29112.log: gzip 0.0% -5 -- replaced with /var/log/ceph/ceph-client.admin.29112.log.gz --verbose 2026-03-24T17:34:58.644 INFO:teuthology.orchestra.run.vm01.stderr: -- /var/log/ceph/ceph-client.admin.59100.log 2026-03-24T17:34:58.644 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.41343.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41343.log.gz 2026-03-24T17:34:58.644 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66719.log 2026-03-24T17:34:58.645 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.59100.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59100.log.gz 2026-03-24T17:34:58.645 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89383.log 2026-03-24T17:34:58.645 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.66719.log: gzip 0.0% -5 -- replaced with /var/log/ceph/ceph-client.admin.66719.log.gz --verbose 2026-03-24T17:34:58.645 INFO:teuthology.orchestra.run.vm01.stderr: -- /var/log/ceph/ceph-client.admin.39620.log 2026-03-24T17:34:58.645 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.89383.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89383.log.gz 2026-03-24T17:34:58.645 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47568.log 2026-03-24T17:34:58.646 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.79885.log 2026-03-24T17:34:58.646 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.47568.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47568.log.gz 2026-03-24T17:34:58.646 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38520.log 2026-03-24T17:34:58.646 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.39620.log: /var/log/ceph/ceph-client.admin.79885.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79885.log.gz 2026-03-24T17:34:58.646 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40205.log 2026-03-24T17:34:58.647 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.38520.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55925.log 2026-03-24T17:34:58.647 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39620.log.gz 2026-03-24T17:34:58.647 INFO:teuthology.orchestra.run.vm01.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.38520.log.gz 2026-03-24T17:34:58.647 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.40205.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74651.log 2026-03-24T17:34:58.647 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.40205.log.gz 2026-03-24T17:34:58.647 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.55925.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55925.log.gz 2026-03-24T17:34:58.647 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25649.log 2026-03-24T17:34:58.647 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28677.log 2026-03-24T17:34:58.648 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.25649.log: gzip -5 0.0% 
--verbose -- replaced with /var/log/ceph/ceph-client.admin.25649.log.gz -- 2026-03-24T17:34:58.648 INFO:teuthology.orchestra.run.vm01.stderr: /var/log/ceph/ceph-client.admin.62493.log 2026-03-24T17:34:58.648 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.74651.log: /var/log/ceph/ceph-client.admin.28677.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28677.log.gz 2026-03-24T17:34:58.648 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74651.log.gz 2026-03-24T17:34:58.648 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27022.log 2026-03-24T17:34:58.648 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36695.log 2026-03-24T17:34:58.649 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.27022.log: gzip -5 --verbose 0.0% -- -- replaced with /var/log/ceph/ceph-client.admin.27022.log.gz /var/log/ceph/ceph-client.admin.88239.log 2026-03-24T17:34:58.649 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:58.649 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.62493.log: /var/log/ceph/ceph-client.admin.36695.log: gzip 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36695.log.gz 2026-03-24T17:34:58.649 INFO:teuthology.orchestra.run.vm01.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.65379.log 2026-03-24T17:34:58.649 INFO:teuthology.orchestra.run.vm01.stderr: 0.0%/var/log/ceph/ceph-client.admin.88239.log: -- replaced with /var/log/ceph/ceph-client.admin.62493.log.gz 2026-03-24T17:34:58.649 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88239.log.gz 2026-03-24T17:34:58.649 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87417.log 2026-03-24T17:34:58.649 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.65379.log: 
0.0% -- replaced with /var/log/ceph/ceph-client.admin.65379.log.gz 2026-03-24T17:34:58.650 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31735.log 2026-03-24T17:34:58.650 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.87417.log: gzip -5 0.0% --verbose -- -- replaced with /var/log/ceph/ceph-client.admin.87417.log.gz 2026-03-24T17:34:58.650 INFO:teuthology.orchestra.run.vm01.stderr: /var/log/ceph/ceph-client.admin.28278.log 2026-03-24T17:34:58.650 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.31735.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.31735.log.gz 2026-03-24T17:34:58.650 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29284.log 2026-03-24T17:34:58.650 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41128.log 2026-03-24T17:34:58.651 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.29284.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29284.log.gz 2026-03-24T17:34:58.651 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81985.log 2026-03-24T17:34:58.651 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.28278.log: /var/log/ceph/ceph-client.admin.41128.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41128.log.gz 2026-03-24T17:34:58.651 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.28278.log.gz 2026-03-24T17:34:58.651 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75922.log 2026-03-24T17:34:58.652 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84184.log 2026-03-24T17:34:58.652 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.81985.log: 
/var/log/ceph/ceph-client.admin.75922.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75922.log.gz 2026-03-24T17:34:58.652 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79799.log 2026-03-24T17:34:58.652 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.81985.log.gz 2026-03-24T17:34:58.652 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.84184.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38499.log 2026-03-24T17:34:58.652 INFO:teuthology.orchestra.run.vm01.stderr: 92.0% -- replaced with /var/log/ceph/ceph-client.admin.84184.log.gz 2026-03-24T17:34:58.652 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.79799.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79799.log.gz 2026-03-24T17:34:58.652 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57646.log 2026-03-24T17:34:58.653 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66953.log 2026-03-24T17:34:58.653 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.57646.log: gzip 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57646.log.gz -5 2026-03-24T17:34:58.653 INFO:teuthology.orchestra.run.vm01.stderr: --verbose -- /var/log/ceph/ceph-client.admin.27519.log 2026-03-24T17:34:58.653 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.38499.log: /var/log/ceph/ceph-client.admin.66953.log: 58.9% -- replaced with /var/log/ceph/ceph-client.admin.66953.log.gz 2026-03-24T17:34:58.654 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44423.log 2026-03-24T17:34:58.654 INFO:teuthology.orchestra.run.vm01.stderr: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.38499.log.gz 2026-03-24T17:34:58.654 
INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.27519.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27519.log.gz 2026-03-24T17:34:58.654 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49621.log 2026-03-24T17:34:58.654 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.44423.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49126.log 2026-03-24T17:34:58.654 INFO:teuthology.orchestra.run.vm01.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.44423.log.gz 2026-03-24T17:34:58.654 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.49621.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49621.log.gz 2026-03-24T17:34:58.654 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59431.log 2026-03-24T17:34:58.655 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.49126.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55560.log 2026-03-24T17:34:58.655 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49126.log.gz 2026-03-24T17:34:58.655 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.59431.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59431.log.gz 2026-03-24T17:34:58.655 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44024.log 2026-03-24T17:34:58.655 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46158.log 2026-03-24T17:34:58.655 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.55560.log: /var/log/ceph/ceph-client.admin.44024.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71495.log 2026-03-24T17:34:58.655 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55560.log.gz 
2026-03-24T17:34:58.655 INFO:teuthology.orchestra.run.vm01.stderr: 26.4%/var/log/ceph/ceph-client.admin.46158.log: -- replaced with /var/log/ceph/ceph-client.admin.44024.log.gz 2026-03-24T17:34:58.656 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46158.log.gz 2026-03-24T17:34:58.656 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27103.log 2026-03-24T17:34:58.656 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.71495.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45707.log 2026-03-24T17:34:58.656 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71495.log.gz 2026-03-24T17:34:58.656 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.27103.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27103.log.gz 2026-03-24T17:34:58.656 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27557.log 2026-03-24T17:34:58.656 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62474.log 2026-03-24T17:34:58.657 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.45707.log: /var/log/ceph/ceph-client.admin.27557.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33268.log 2026-03-24T17:34:58.657 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27557.log.gz 2026-03-24T17:34:58.657 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45707.log.gz 2026-03-24T17:34:58.657 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.62474.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62474.log.gz 2026-03-24T17:34:58.657 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55989.log 2026-03-24T17:34:58.657 
INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51005.log 2026-03-24T17:34:58.657 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.33268.log: /var/log/ceph/ceph-client.admin.55989.log: gzip 0.0% -5 -- replaced with /var/log/ceph/ceph-client.admin.55989.log.gz --verbose 2026-03-24T17:34:58.657 INFO:teuthology.orchestra.run.vm01.stderr: -- /var/log/ceph/ceph-client.admin.47688.log 2026-03-24T17:34:58.658 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.33268.log.gz 2026-03-24T17:34:58.658 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.51005.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51005.log.gz 2026-03-24T17:34:58.658 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32399.log 2026-03-24T17:34:58.658 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63689.log 2026-03-24T17:34:58.659 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.47688.log: /var/log/ceph/ceph-client.admin.32399.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31307.log 2026-03-24T17:34:58.659 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47688.log.gz 2026-03-24T17:34:58.659 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32399.log.gz 2026-03-24T17:34:58.659 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.63689.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63689.log.gz 2026-03-24T17:34:58.659 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46465.log 2026-03-24T17:34:58.659 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.31307.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.22875.log 
2026-03-24T17:34:58.659 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31307.log.gz 2026-03-24T17:34:58.659 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.46465.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63548.log 2026-03-24T17:34:58.659 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46465.log.gz 2026-03-24T17:34:58.659 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43860.log 2026-03-24T17:34:58.660 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.22875.log: /var/log/ceph/ceph-client.admin.63548.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77100.log 2026-03-24T17:34:58.660 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63548.log.gz 2026-03-24T17:34:58.660 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.22875.log.gz 2026-03-24T17:34:58.660 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.43860.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74952.log 2026-03-24T17:34:58.660 INFO:teuthology.orchestra.run.vm01.stderr: 25.9% -- replaced with /var/log/ceph/ceph-client.admin.43860.log.gz 2026-03-24T17:34:58.660 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41300.log 2026-03-24T17:34:58.660 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.77100.log: /var/log/ceph/ceph-client.admin.74952.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74542.log 2026-03-24T17:34:58.660 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74952.log.gz 2026-03-24T17:34:58.661 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77100.log.gz 2026-03-24T17:34:58.661 
INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.41300.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41300.log.gz 2026-03-24T17:34:58.661 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42858.log 2026-03-24T17:34:58.661 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81172.log 2026-03-24T17:34:58.661 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.74542.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74542.log.gz 2026-03-24T17:34:58.661 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.42858.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42858.log.gz 2026-03-24T17:34:58.662 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33657.log 2026-03-24T17:34:58.662 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.81172.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31607.log 2026-03-24T17:34:58.662 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81172.log.gz 2026-03-24T17:34:58.662 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.33657.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31178.log 2026-03-24T17:34:58.662 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33657.log.gz 2026-03-24T17:34:58.662 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43422.log 2026-03-24T17:34:58.663 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.31607.log: /var/log/ceph/ceph-client.admin.31178.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78721.log 2026-03-24T17:34:58.663 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31178.log.gz 
2026-03-24T17:34:58.663 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31607.log.gz
2026-03-24T17:34:58.663 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43422.log
2026-03-24T17:34:58.663 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.43422.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.43422.log.gz
[2026-03-24T17:34:58.663 - 17:34:58.700 INFO:teuthology.orchestra.run.vm01.stderr: interleaved output from concurrent `gzip -5 --verbose` runs compressing the remaining /var/log/ceph/ceph-client.admin.<pid>.log files (plus /var/log/ceph/ceph.tmp-client.admin.19131.log); each file was replaced with its .log.gz, with reported compression ratios mostly 0.0% and otherwise ranging from 1.2% to 89.5%]
replaced with /var/log/ceph/ceph-client.admin.76278.log.gz 2026-03-24T17:34:58.700 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79133.log 2026-03-24T17:34:58.700 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26717.log 2026-03-24T17:34:58.700 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.54527.log: /var/log/ceph/ceph-client.admin.79133.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46219.log 2026-03-24T17:34:58.700 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79133.log.gz 2026-03-24T17:34:58.700 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54527.log.gz 2026-03-24T17:34:58.701 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.26717.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26717.log.gz 2026-03-24T17:34:58.701 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79713.log 2026-03-24T17:34:58.701 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60929.log 2026-03-24T17:34:58.702 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.46219.log: /var/log/ceph/ceph-client.admin.79713.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79713.log.gz 2026-03-24T17:34:58.702 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.60929.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60929.log.gz 2026-03-24T17:34:58.702 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81899.log 2026-03-24T17:34:58.702 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46219.log.gz 2026-03-24T17:34:58.702 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.31542.log 2026-03-24T17:34:58.702 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.81899.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55129.log 2026-03-24T17:34:58.702 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81899.log.gz 2026-03-24T17:34:58.702 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.31542.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47548.log 2026-03-24T17:34:58.702 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31542.log.gz 2026-03-24T17:34:58.702 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72101.log 2026-03-24T17:34:58.703 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.55129.log: /var/log/ceph/ceph-client.admin.47548.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32126.log 2026-03-24T17:34:58.703 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55129.log.gz 2026-03-24T17:34:58.703 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.72101.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.47548.log.gz 2026-03-24T17:34:58.703 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72101.log.gz 2026-03-24T17:34:58.703 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56321.log 2026-03-24T17:34:58.703 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.32126.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35604.log 2026-03-24T17:34:58.703 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.56321.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32126.log.gz 2026-03-24T17:34:58.703 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.56321.log.gz 2026-03-24T17:34:58.704 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34097.log 2026-03-24T17:34:58.704 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.35604.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32961.log 2026-03-24T17:34:58.704 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35604.log.gz 2026-03-24T17:34:58.704 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.34097.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27387.log 2026-03-24T17:34:58.704 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34097.log.gz 2026-03-24T17:34:58.704 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72531.log 2026-03-24T17:34:58.705 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.32961.log: /var/log/ceph/ceph-client.admin.27387.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60284.log 2026-03-24T17:34:58.705 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.27387.log.gz 2026-03-24T17:34:58.705 INFO:teuthology.orchestra.run.vm01.stderr: 1.2%/var/log/ceph/ceph-client.admin.72531.log: -- replaced with /var/log/ceph/ceph-client.admin.32961.log.gz 2026-03-24T17:34:58.705 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72531.log.gz 2026-03-24T17:34:58.705 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46593.log 2026-03-24T17:34:58.705 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.60284.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54282.log 2026-03-24T17:34:58.705 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.60284.log.gz 2026-03-24T17:34:58.706 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.46593.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34537.log 2026-03-24T17:34:58.706 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46593.log.gz 2026-03-24T17:34:58.706 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.54282.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54742.log 2026-03-24T17:34:58.706 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.54282.log.gz 2026-03-24T17:34:58.706 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29198.log 2026-03-24T17:34:58.706 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.34537.log: /var/log/ceph/ceph-client.admin.54742.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77844.log 2026-03-24T17:34:58.706 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54742.log.gz 2026-03-24T17:34:58.706 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34537.log.gz 2026-03-24T17:34:58.706 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.29198.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29198.log.gz 2026-03-24T17:34:58.707 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74030.log 2026-03-24T17:34:58.707 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30702.log 2026-03-24T17:34:58.707 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.77844.log: /var/log/ceph/ceph-client.admin.74030.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48367.log 2026-03-24T17:34:58.707 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.74030.log.gz 2026-03-24T17:34:58.707 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77844.log.gz 2026-03-24T17:34:58.707 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.30702.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52645.log 2026-03-24T17:34:58.708 INFO:teuthology.orchestra.run.vm01.stderr: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.30702.log.gz 2026-03-24T17:34:58.708 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34377.log 2026-03-24T17:34:58.708 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.48367.log: 0.0%/var/log/ceph/ceph-client.admin.52645.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75686.log 2026-03-24T17:34:58.708 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52645.log.gz 2026-03-24T17:34:58.708 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.48367.log.gz 2026-03-24T17:34:58.708 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.34377.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34377.log.gz 2026-03-24T17:34:58.708 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33098.log 2026-03-24T17:34:58.708 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31113.log 2026-03-24T17:34:58.709 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.75686.log: /var/log/ceph/ceph-client.admin.33098.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33286.log 2026-03-24T17:34:58.709 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75686.log.gz 2026-03-24T17:34:58.709 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with 
/var/log/ceph/ceph-client.admin.33098.log.gz 2026-03-24T17:34:58.709 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.31113.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31113.log.gz 2026-03-24T17:34:58.709 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28630.log 2026-03-24T17:34:58.709 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.33286.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35319.log 2026-03-24T17:34:58.709 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.33286.log.gz 2026-03-24T17:34:58.710 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35928.log 2026-03-24T17:34:58.710 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.28630.log: /var/log/ceph/ceph-client.admin.35319.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74244.log 2026-03-24T17:34:58.710 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35319.log.gz 2026-03-24T17:34:58.710 INFO:teuthology.orchestra.run.vm01.stderr: 5.3% -- replaced with /var/log/ceph/ceph-client.admin.28630.log.gz 2026-03-24T17:34:58.710 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.35928.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35928.log.gz 2026-03-24T17:34:58.710 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49600.log 2026-03-24T17:34:58.710 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42691.log 2026-03-24T17:34:58.711 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.74244.log: /var/log/ceph/ceph-client.admin.49600.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90687.log 2026-03-24T17:34:58.711 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.49600.log.gz 2026-03-24T17:34:58.711 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.74244.log.gz 2026-03-24T17:34:58.711 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.42691.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42691.log.gz 2026-03-24T17:34:58.711 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83832.log 2026-03-24T17:34:58.711 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.90687.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61509.log 2026-03-24T17:34:58.711 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90687.log.gz 2026-03-24T17:34:58.711 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.83832.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65397.log 2026-03-24T17:34:58.711 INFO:teuthology.orchestra.run.vm01.stderr: 27.6% -- replaced with /var/log/ceph/ceph-client.admin.83832.log.gz 2026-03-24T17:34:58.712 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65802.log 2026-03-24T17:34:58.712 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.61509.log: /var/log/ceph/ceph-client.admin.65397.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65261.log 2026-03-24T17:34:58.712 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65397.log.gz 2026-03-24T17:34:58.712 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61509.log.gz 2026-03-24T17:34:58.712 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.65802.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65802.log.gz 2026-03-24T17:34:58.712 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.67469.log 2026-03-24T17:34:58.713 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50812.log 2026-03-24T17:34:58.713 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.65261.log: /var/log/ceph/ceph-client.admin.67469.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45296.log 2026-03-24T17:34:58.713 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67469.log.gz 2026-03-24T17:34:58.713 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65261.log.gz 2026-03-24T17:34:58.713 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.50812.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50812.log.gz 2026-03-24T17:34:58.713 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87031.log 2026-03-24T17:34:58.714 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40186.log 2026-03-24T17:34:58.714 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.45296.log: /var/log/ceph/ceph-client.admin.87031.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88673.log 2026-03-24T17:34:58.714 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87031.log.gz 2026-03-24T17:34:58.714 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45296.log.gz 2026-03-24T17:34:58.714 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.40186.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40186.log.gz 2026-03-24T17:34:58.714 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71715.log 2026-03-24T17:34:58.714 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.84311.log 2026-03-24T17:34:58.715 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.88673.log: /var/log/ceph/ceph-client.admin.71715.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51816.log 2026-03-24T17:34:58.715 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71715.log.gz 2026-03-24T17:34:58.715 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88673.log.gz 2026-03-24T17:34:58.715 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.84311.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84311.log.gz 2026-03-24T17:34:58.715 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61424.log 2026-03-24T17:34:58.715 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28342.log 2026-03-24T17:34:58.715 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.51816.log: /var/log/ceph/ceph-client.admin.61424.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25492.log 2026-03-24T17:34:58.715 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61424.log.gz 2026-03-24T17:34:58.716 INFO:teuthology.orchestra.run.vm01.stderr: 57.2% -- replaced with /var/log/ceph/ceph-client.admin.51816.log.gz 2026-03-24T17:34:58.716 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.28342.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28342.log.gz 2026-03-24T17:34:58.716 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89426.log 2026-03-24T17:34:58.716 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88974.log 2026-03-24T17:34:58.716 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.25492.log: 
/var/log/ceph/ceph-client.admin.89426.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37315.log 2026-03-24T17:34:58.716 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89426.log.gz 2026-03-24T17:34:58.716 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25492.log.gz 2026-03-24T17:34:58.717 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.88974.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88974.log.gz 2026-03-24T17:34:58.717 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40934.log 2026-03-24T17:34:58.717 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79455.log 2026-03-24T17:34:58.717 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.37315.log: /var/log/ceph/ceph-client.admin.40934.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64652.log 2026-03-24T17:34:58.717 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40934.log.gz 2026-03-24T17:34:58.717 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.79455.log: 26.6% -- replaced with /var/log/ceph/ceph-client.admin.37315.log.gz 2026-03-24T17:34:58.717 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79455.log.gz 2026-03-24T17:34:58.718 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46183.log 2026-03-24T17:34:58.718 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88609.log 2026-03-24T17:34:58.718 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.64652.log: /var/log/ceph/ceph-client.admin.46183.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59275.log 2026-03-24T17:34:58.718 
INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46183.log.gz 2026-03-24T17:34:58.718 INFO:teuthology.orchestra.run.vm01.stderr: 29.7% -- replaced with /var/log/ceph/ceph-client.admin.64652.log.gz 2026-03-24T17:34:58.718 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.88609.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88609.log.gz 2026-03-24T17:34:58.718 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63279.log 2026-03-24T17:34:58.719 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58470.log 2026-03-24T17:34:58.719 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.59275.log: /var/log/ceph/ceph-client.admin.63279.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64141.log 2026-03-24T17:34:58.719 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63279.log.gz 2026-03-24T17:34:58.719 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59275.log.gz 2026-03-24T17:34:58.719 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.58470.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58470.log.gz 2026-03-24T17:34:58.719 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66081.log 2026-03-24T17:34:58.720 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73924.log 2026-03-24T17:34:58.720 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.64141.log: /var/log/ceph/ceph-client.admin.66081.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80799.log 2026-03-24T17:34:58.720 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66081.log.gz 2026-03-24T17:34:58.720 
INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64141.log.gz 2026-03-24T17:34:58.720 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.73924.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73924.log.gz 2026-03-24T17:34:58.720 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55474.log 2026-03-24T17:34:58.720 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79176.log 2026-03-24T17:34:58.721 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.80799.log: /var/log/ceph/ceph-client.admin.55474.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62236.log 2026-03-24T17:34:58.721 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55474.log.gz 2026-03-24T17:34:58.721 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80799.log.gz 2026-03-24T17:34:58.721 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.79176.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79176.log.gz 2026-03-24T17:34:58.721 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84574.log 2026-03-24T17:34:58.721 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47751.log 2026-03-24T17:34:58.721 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.62236.log: /var/log/ceph/ceph-client.admin.84574.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84574.log.gz 2026-03-24T17:34:58.722 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.62236.log.gzgzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64439.log 2026-03-24T17:34:58.722 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:58.722 
INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.47751.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31242.log 2026-03-24T17:34:58.722 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47751.log.gz 2026-03-24T17:34:58.722 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.64439.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49384.log 2026-03-24T17:34:58.722 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64439.log.gz 2026-03-24T17:34:58.722 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39771.log 2026-03-24T17:34:58.722 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.31242.log: /var/log/ceph/ceph-client.admin.49384.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57198.log 2026-03-24T17:34:58.722 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49384.log.gz 2026-03-24T17:34:58.723 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31242.log.gz 2026-03-24T17:34:58.723 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.39771.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39771.log.gz 2026-03-24T17:34:58.723 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63298.log 2026-03-24T17:34:58.723 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67297.log 2026-03-24T17:34:58.723 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.57198.log: /var/log/ceph/ceph-client.admin.63298.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84768.log 2026-03-24T17:34:58.723 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63298.log.gz 
2026-03-24T17:34:58.723 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57198.log.gz
2026-03-24T17:34:58.723 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.67297.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67297.log.gz
2026-03-24T17:34:58.724 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88781.log
2026-03-24T17:34:58.724 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37945.log
2026-03-24T17:34:58.724 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.84768.log: /var/log/ceph/ceph-client.admin.88781.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39663.log
2026-03-24T17:34:58.724 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88781.log.gz
2026-03-24T17:34:58.724 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84768.log.gz
2026-03-24T17:34:58.724 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.37945.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78791.log
2026-03-24T17:34:58.725 INFO:teuthology.orchestra.run.vm01.stderr: 52.6% -- replaced with /var/log/ceph/ceph-client.admin.37945.log.gz
2026-03-24T17:34:58.725 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36780.log
2026-03-24T17:34:58.725 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.39663.log: /var/log/ceph/ceph-client.admin.78791.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29326.log
2026-03-24T17:34:58.725 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78791.log.gz
2026-03-24T17:34:58.725 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39663.log.gz
2026-03-24T17:34:58.725 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.36780.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36780.log.gz
2026-03-24T17:34:58.725 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45167.log
2026-03-24T17:34:58.726 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41968.log
2026-03-24T17:34:58.726 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.29326.log: /var/log/ceph/ceph-client.admin.45167.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69851.log
2026-03-24T17:34:58.726 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45167.log.gz
2026-03-24T17:34:58.726 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29326.log.gz
2026-03-24T17:34:58.726 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.41968.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41968.log.gz
2026-03-24T17:34:58.726 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56195.log
2026-03-24T17:34:58.726 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27976.log
2026-03-24T17:34:58.727 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.69851.log: /var/log/ceph/ceph-client.admin.56195.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63568.log
2026-03-24T17:34:58.727 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56195.log.gz
2026-03-24T17:34:58.727 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69851.log.gz
2026-03-24T17:34:58.727 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.27976.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27976.log.gz
2026-03-24T17:34:58.727 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76627.log
2026-03-24T17:34:58.727 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45728.log
2026-03-24T17:34:58.727 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.63568.log: /var/log/ceph/ceph-client.admin.76627.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51113.log
2026-03-24T17:34:58.728 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76627.log.gz
2026-03-24T17:34:58.728 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.63568.log.gz
2026-03-24T17:34:58.728 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.45728.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45728.log.gz
2026-03-24T17:34:58.728 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67147.log
2026-03-24T17:34:58.728 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.51113.log: gzip -5 0.0% --verbose -- -- replaced with /var/log/ceph/ceph-client.admin.51113.log.gz /var/log/ceph/ceph-client.admin.36183.log
2026-03-24T17:34:58.728 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:58.728 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72685.log
2026-03-24T17:34:58.728 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.67147.log: /var/log/ceph/ceph-client.admin.36183.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81251.log
2026-03-24T17:34:58.728 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36183.log.gz
2026-03-24T17:34:58.729 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67147.log.gz
2026-03-24T17:34:58.729 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.72685.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72685.log.gz
2026-03-24T17:34:58.729 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45814.log
2026-03-24T17:34:58.729 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29971.log
2026-03-24T17:34:58.729 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.81251.log: /var/log/ceph/ceph-client.admin.45814.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88953.log
2026-03-24T17:34:58.729 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45814.log.gz
2026-03-24T17:34:58.729 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81251.log.gz
2026-03-24T17:34:58.730 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.29971.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29971.log.gz
2026-03-24T17:34:58.730 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34397.log
2026-03-24T17:34:58.730 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67555.log
2026-03-24T17:34:58.730 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.88953.log: /var/log/ceph/ceph-client.admin.34397.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79369.log
2026-03-24T17:34:58.730 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34397.log.gz
2026-03-24T17:34:58.730 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88953.log.gz
2026-03-24T17:34:58.730 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.67555.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67555.log.gz
2026-03-24T17:34:58.731 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82152.log
2026-03-24T17:34:58.731 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74440.log
2026-03-24T17:34:58.731 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.79369.log: /var/log/ceph/ceph-client.admin.82152.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47285.log
2026-03-24T17:34:58.731 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82152.log.gz
2026-03-24T17:34:58.731 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79369.log.gz
2026-03-24T17:34:58.731 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.74440.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74440.log.gz
2026-03-24T17:34:58.731 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76713.log
2026-03-24T17:34:58.732 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72165.log
2026-03-24T17:34:58.732 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.47285.log: /var/log/ceph/ceph-client.admin.76713.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76713.log.gz
2026-03-24T17:34:58.732 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47628.log
2026-03-24T17:34:58.732 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47285.log.gz
2026-03-24T17:34:58.732 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.72165.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47478.log
2026-03-24T17:34:58.732 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72165.log.gz
2026-03-24T17:34:58.732 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.47628.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47628.log.gz
2026-03-24T17:34:58.732 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56154.log
2026-03-24T17:34:58.733 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65004.log
2026-03-24T17:34:58.733 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.47478.log: /var/log/ceph/ceph-client.admin.56154.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76670.log
2026-03-24T17:34:58.733 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56154.log.gz
2026-03-24T17:34:58.733 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47478.log.gz
2026-03-24T17:34:58.733 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.65004.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65004.log.gz
2026-03-24T17:34:58.733 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38289.log
2026-03-24T17:34:58.734 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32859.log
2026-03-24T17:34:58.734 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.76670.log: /var/log/ceph/ceph-client.admin.38289.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35826.log
2026-03-24T17:34:58.734 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76670.log.gz
2026-03-24T17:34:58.734 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.32859.log: 27.1% -- replaced with /var/log/ceph/ceph-client.admin.38289.log.gz
2026-03-24T17:34:58.734 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54116.log
2026-03-24T17:34:58.734 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32859.log.gz
2026-03-24T17:34:58.735 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.35826.log: gzip -5 --verbose -- 0.0% /var/log/ceph/ceph-client.admin.61833.log
2026-03-24T17:34:58.735 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.35826.log.gz
2026-03-24T17:34:58.735 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64553.log
2026-03-24T17:34:58.735 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.54116.log: /var/log/ceph/ceph-client.admin.61833.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61833.log.gz
2026-03-24T17:34:58.735 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53358.log
2026-03-24T17:34:58.735 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54116.log.gz
2026-03-24T17:34:58.735 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.64553.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64553.log.gz
2026-03-24T17:34:58.735 INFO:teuthology.orchestra.run.vm01.stderr: 92.1%gzip -- replaced with /var/log/ceph/ceph-mgr.x.log.gz
2026-03-24T17:34:58.735 INFO:teuthology.orchestra.run.vm01.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.31500.log
2026-03-24T17:34:58.736 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31436.log
2026-03-24T17:34:58.736 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.53358.log: /var/log/ceph/ceph-client.admin.31500.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50855.log
2026-03-24T17:34:58.736 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31500.log.gz
2026-03-24T17:34:58.736 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53358.log.gz
2026-03-24T17:34:58.736 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.31436.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81727.log
2026-03-24T17:34:58.736 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31436.log.gz
2026-03-24T17:34:58.736 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.50855.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73664.log
2026-03-24T17:34:58.736 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50855.log.gz
2026-03-24T17:34:58.737 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.81727.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81727.log.gz
2026-03-24T17:34:58.737 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33029.log
2026-03-24T17:34:58.737 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83552.log
2026-03-24T17:34:58.738 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.73664.log: /var/log/ceph/ceph-client.admin.33029.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77932.log
2026-03-24T17:34:58.738 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73664.log.gz
2026-03-24T17:34:58.738 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.33029.log.gz
2026-03-24T17:34:58.738 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.83552.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83552.log.gz
2026-03-24T17:34:58.738 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73790.log
2026-03-24T17:34:58.738 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.77932.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31199.log
2026-03-24T17:34:58.738 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77932.log.gz
2026-03-24T17:34:58.738 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.73790.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73790.log.gz
2026-03-24T17:34:58.738 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59940.log
2026-03-24T17:34:58.739 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.31199.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32757.log
2026-03-24T17:34:58.739 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31199.log.gz
2026-03-24T17:34:58.739 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.59940.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76094.log
2026-03-24T17:34:58.739 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59940.log.gz
2026-03-24T17:34:58.739 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.32757.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64258.log
2026-03-24T17:34:58.739 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32757.log.gz
2026-03-24T17:34:58.739 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5/var/log/ceph/ceph-client.admin.76094.log: --verbose -- /var/log/ceph/ceph-client.admin.27425.log
2026-03-24T17:34:58.739 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76094.log.gz
2026-03-24T17:34:58.740 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.64258.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40461.log
2026-03-24T17:34:58.740 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64258.log.gz
2026-03-24T17:34:58.740 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.27425.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27425.log.gz
2026-03-24T17:34:58.740 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62958.log
2026-03-24T17:34:58.740 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29541.log
2026-03-24T17:34:58.740 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.40461.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40461.log.gz
2026-03-24T17:34:58.740 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.62958.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46962.log
2026-03-24T17:34:58.740 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62958.log.gz
2026-03-24T17:34:58.741 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.29541.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29541.log.gz
2026-03-24T17:34:58.741 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59452.log
2026-03-24T17:34:58.741 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38016.log
2026-03-24T17:34:58.741 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.46962.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46962.log.gz
2026-03-24T17:34:58.741 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.59452.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77405.log
2026-03-24T17:34:58.741 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59452.log.gz
2026-03-24T17:34:58.741 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.38016.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38016.log.gz
2026-03-24T17:34:58.741 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85240.log
2026-03-24T17:34:58.741 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.77405.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77405.log.gz
2026-03-24T17:34:58.742 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80296.log
2026-03-24T17:34:58.742 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.85240.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48159.log
2026-03-24T17:34:58.742 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85240.log.gz
2026-03-24T17:34:58.742 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.80296.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75514.log
2026-03-24T17:34:58.742 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80296.log.gz
2026-03-24T17:34:58.742 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.48159.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34077.log
2026-03-24T17:34:58.742 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48159.log.gz
2026-03-24T17:34:58.743 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.75514.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44608.log
2026-03-24T17:34:58.743 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75514.log.gz
2026-03-24T17:34:58.743 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.34077.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44403.log
2026-03-24T17:34:58.743 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34077.log.gz
2026-03-24T17:34:58.743 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.44608.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88150.log
2026-03-24T17:34:58.743 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44608.log.gz
2026-03-24T17:34:58.743 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.44403.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38247.log
2026-03-24T17:34:58.743 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.88150.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35741.log
2026-03-24T17:34:58.743 INFO:teuthology.orchestra.run.vm01.stderr: 25.9% -- replaced with /var/log/ceph/ceph-client.admin.44403.log.gz
2026-03-24T17:34:58.744 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88150.log.gz
2026-03-24T17:34:58.744 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.38247.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51756.log
2026-03-24T17:34:58.744 INFO:teuthology.orchestra.run.vm01.stderr: 25.8% -- replaced with /var/log/ceph/ceph-client.admin.38247.log.gz
2026-03-24T17:34:58.744 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.35741.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86386.log
2026-03-24T17:34:58.744 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35741.log.gz
2026-03-24T17:34:58.744 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.51756.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53161.log
2026-03-24T17:34:58.744 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51756.log.gz
2026-03-24T17:34:58.744 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.86386.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73554.log
2026-03-24T17:34:58.744 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86386.log.gz
2026-03-24T17:34:58.745 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.53161.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48451.log
2026-03-24T17:34:58.745 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53161.log.gz
2026-03-24T17:34:58.745 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.73554.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38856.log
2026-03-24T17:34:58.745 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73554.log.gz
2026-03-24T17:34:58.745 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.48451.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54379.log
2026-03-24T17:34:58.745 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.38856.log: 58.2% -- replaced with /var/log/ceph/ceph-client.admin.48451.log.gz
2026-03-24T17:34:58.745 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38856.log.gz
2026-03-24T17:34:58.746 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55732.log
2026-03-24T17:34:58.746 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.54379.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55237.log
2026-03-24T17:34:58.746 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54379.log.gz
2026-03-24T17:34:58.746 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.55732.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86536.log
2026-03-24T17:34:58.746 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55732.log.gz
2026-03-24T17:34:58.746 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.55237.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59196.log
2026-03-24T17:34:58.746 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55237.log.gz
2026-03-24T17:34:58.747 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.86536.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37504.log
2026-03-24T17:34:58.747 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86536.log.gz
2026-03-24T17:34:58.747 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.59196.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57241.log
2026-03-24T17:34:58.747 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59196.log.gz
2026-03-24T17:34:58.747 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.37504.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59257.log
2026-03-24T17:34:58.747 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.57241.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53032.log
2026-03-24T17:34:58.747 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57241.log.gz
2026-03-24T17:34:58.747 INFO:teuthology.orchestra.run.vm01.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.37504.log.gz
2026-03-24T17:34:58.747 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.59257.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41659.log
2026-03-24T17:34:58.748 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59257.log.gz
2026-03-24T17:34:58.748 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76864.log
2026-03-24T17:34:58.748 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.53032.log: /var/log/ceph/ceph-client.admin.41659.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59801.log
2026-03-24T17:34:58.748 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41659.log.gz
2026-03-24T17:34:58.748 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53032.log.gz
2026-03-24T17:34:58.748 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.76864.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48869.log
2026-03-24T17:34:58.748 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76864.log.gz
2026-03-24T17:34:58.748 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.59801.log: gzip 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59801.log.gz -5
2026-03-24T17:34:58.748 INFO:teuthology.orchestra.run.vm01.stderr: --verbose -- /var/log/ceph/ceph-client.admin.26942.log
2026-03-24T17:34:58.749 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.48869.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30207.log
2026-03-24T17:34:58.749 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48869.log.gz
2026-03-24T17:34:58.749 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.26942.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62377.log
2026-03-24T17:34:58.749 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.30207.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.26942.log.gz
2026-03-24T17:34:58.749 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30207.log.gz
2026-03-24T17:34:58.749 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70000.log
2026-03-24T17:34:58.750 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.62377.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57890.log
2026-03-24T17:34:58.750 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62377.log.gz
2026-03-24T17:34:58.750 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.70000.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80189.log
2026-03-24T17:34:58.750 INFO:teuthology.orchestra.run.vm01.stderr: 58.1% -- replaced with /var/log/ceph/ceph-client.admin.70000.log.gz
2026-03-24T17:34:58.750 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.57890.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61252.log
2026-03-24T17:34:58.750 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57890.log.gz
2026-03-24T17:34:58.750 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.80189.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75253.log
2026-03-24T17:34:58.750 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80189.log.gz
2026-03-24T17:34:58.750 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.61252.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60951.log
2026-03-24T17:34:58.750 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61252.log.gz
2026-03-24T17:34:58.751 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.75253.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43240.log
2026-03-24T17:34:58.751 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75253.log.gz
2026-03-24T17:34:58.751 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.60951.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43464.log
2026-03-24T17:34:58.751 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60951.log.gz
2026-03-24T17:34:58.751 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.43240.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47709.log
2026-03-24T17:34:58.751 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43240.log.gz
2026-03-24T17:34:58.751 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.43464.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76159.log
2026-03-24T17:34:58.751 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43464.log.gz
2026-03-24T17:34:58.751 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.47709.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70408.log
2026-03-24T17:34:58.751 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47709.log.gz
2026-03-24T17:34:58.752 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.76159.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43086.log
2026-03-24T17:34:58.752 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76159.log.gz
2026-03-24T17:34:58.752 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.70408.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74673.log
2026-03-24T17:34:58.752 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70408.log.gz
2026-03-24T17:34:58.752 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.43086.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72294.log
2026-03-24T17:34:58.752 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43086.log.gz
2026-03-24T17:34:58.752 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.74673.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72036.log
2026-03-24T17:34:58.752 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74673.log.gz
2026-03-24T17:34:58.752 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.72294.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44003.log
2026-03-24T17:34:58.752 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72294.log.gz
2026-03-24T17:34:58.753 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.72036.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68179.log
2026-03-24T17:34:58.753 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72036.log.gz
2026-03-24T17:34:58.753 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.44003.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56833.log
2026-03-24T17:34:58.753 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44003.log.gz
2026-03-24T17:34:58.753 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.68179.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44503.log
2026-03-24T17:34:58.753 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68179.log.gz
2026-03-24T17:34:58.753 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.56833.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30229.log
2026-03-24T17:34:58.753 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56833.log.gz
2026-03-24T17:34:58.754 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.44503.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33320.log
2026-03-24T17:34:58.754 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.30229.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56897.log
2026-03-24T17:34:58.754 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30229.log.gz
2026-03-24T17:34:58.754 INFO:teuthology.orchestra.run.vm01.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.44503.log.gz
2026-03-24T17:34:58.754 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.33320.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68200.log
2026-03-24T17:34:58.754 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.33320.log.gz
2026-03-24T17:34:58.754 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28961.log
2026-03-24T17:34:58.755 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.56897.log: /var/log/ceph/ceph-client.admin.68200.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25646.log
2026-03-24T17:34:58.755 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68200.log.gz
2026-03-24T17:34:58.755 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56897.log.gz
2026-03-24T17:34:58.755 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.28961.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52623.log
2026-03-24T17:34:58.755 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28961.log.gz
2026-03-24T17:34:58.755 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.25646.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62765.log
2026-03-24T17:34:58.755 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25646.log.gz
2026-03-24T17:34:58.755 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.52623.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31006.log
2026-03-24T17:34:58.755 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52623.log.gz
2026-03-24T17:34:58.755 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.62765.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68867.log
2026-03-24T17:34:58.755 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62765.log.gz
2026-03-24T17:34:58.756 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.31006.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80765.log
2026-03-24T17:34:58.756 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31006.log.gz
2026-03-24T17:34:58.756 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.68867.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88824.log
2026-03-24T17:34:58.756 INFO:teuthology.orchestra.run.vm01.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.68867.log.gz
2026-03-24T17:34:58.756 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.80765.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30510.log
2026-03-24T17:34:58.756 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80765.log.gz
2026-03-24T17:34:58.756 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.88824.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54299.log
2026-03-24T17:34:58.756 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88824.log.gz
2026-03-24T17:34:58.756 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.30510.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36967.log
2026-03-24T17:34:58.756 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30510.log.gz
2026-03-24T17:34:58.757 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.54299.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54218.log
2026-03-24T17:34:58.757 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54299.log.gz
2026-03-24T17:34:58.757 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.36967.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89404.log
2026-03-24T17:34:58.757 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36967.log.gz
2026-03-24T17:34:58.757 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.54218.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41386.log
2026-03-24T17:34:58.757 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54218.log.gz
2026-03-24T17:34:58.757 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.89404.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89404.log.gz
2026-03-24T17:34:58.758 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66763.log
2026-03-24T17:34:58.758 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.41386.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83225.log
2026-03-24T17:34:58.758 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41386.log.gz
2026-03-24T17:34:58.758 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.66763.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81620.log
2026-03-24T17:34:58.758 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66763.log.gz 2026-03-24T17:34:58.758 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.83225.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76649.log 2026-03-24T17:34:58.758 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83225.log.gz 2026-03-24T17:34:58.758 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.81620.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34919.log 2026-03-24T17:34:58.758 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81620.log.gz 2026-03-24T17:34:58.759 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.76649.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47648.log 2026-03-24T17:34:58.759 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76649.log.gz 2026-03-24T17:34:58.759 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.34919.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72940.log 2026-03-24T17:34:58.759 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34919.log.gz 2026-03-24T17:34:58.759 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.47648.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39341.log 2026-03-24T17:34:58.759 INFO:teuthology.orchestra.run.vm01.stderr: 28.9% -- replaced with /var/log/ceph/ceph-client.admin.47648.log.gz 2026-03-24T17:34:58.759 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.72940.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72940.log.gz 2026-03-24T17:34:58.759 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55064.log 2026-03-24T17:34:58.760 
INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.39341.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45188.log 2026-03-24T17:34:58.760 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39341.log.gz 2026-03-24T17:34:58.760 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.55064.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40999.log 2026-03-24T17:34:58.760 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55064.log.gz 2026-03-24T17:34:58.760 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.45188.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69593.log 2026-03-24T17:34:58.760 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45188.log.gz 2026-03-24T17:34:58.760 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.40999.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59139.log 2026-03-24T17:34:58.760 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40999.log.gz 2026-03-24T17:34:58.760 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.69593.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74013.log 2026-03-24T17:34:58.760 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69593.log.gz 2026-03-24T17:34:58.761 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.59139.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77143.log 2026-03-24T17:34:58.761 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59139.log.gz 2026-03-24T17:34:58.761 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.74013.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50683.log 
2026-03-24T17:34:58.761 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74013.log.gz 2026-03-24T17:34:58.761 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.77143.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33937.log 2026-03-24T17:34:58.761 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77143.log.gz 2026-03-24T17:34:58.761 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.50683.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50683.log.gz 2026-03-24T17:34:58.761 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37147.log 2026-03-24T17:34:58.762 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.33937.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50407.log 2026-03-24T17:34:58.762 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33937.log.gz 2026-03-24T17:34:58.762 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.37147.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43443.log 2026-03-24T17:34:58.762 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.50407.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50407.log.gz 2026-03-24T17:34:58.762 INFO:teuthology.orchestra.run.vm01.stderr: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.37147.log.gz 2026-03-24T17:34:58.762 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77186.log 2026-03-24T17:34:58.762 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.43443.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74694.log 2026-03-24T17:34:58.762 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43443.log.gz 2026-03-24T17:34:58.763 
INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.77186.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46919.log 2026-03-24T17:34:58.763 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77186.log.gz 2026-03-24T17:34:58.763 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.74694.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86211.log 2026-03-24T17:34:58.763 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74694.log.gz 2026-03-24T17:34:58.763 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.46919.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64731.log 2026-03-24T17:34:58.763 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46919.log.gz 2026-03-24T17:34:58.763 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.86211.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80210.log 2026-03-24T17:34:58.763 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86211.log.gz 2026-03-24T17:34:58.763 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.64731.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31049.log 2026-03-24T17:34:58.763 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64731.log.gz 2026-03-24T17:34:58.764 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.80210.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74866.log 2026-03-24T17:34:58.764 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80210.log.gz 2026-03-24T17:34:58.764 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.31049.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43542.log 
2026-03-24T17:34:58.764 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31049.log.gz 2026-03-24T17:34:58.764 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.74866.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84940.log 2026-03-24T17:34:58.764 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74866.log.gz 2026-03-24T17:34:58.764 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.43542.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72402.log 2026-03-24T17:34:58.764 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.84940.log: 56.0%gzip 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43542.log.gz 2026-03-24T17:34:58.764 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.84940.log.gz 2026-03-24T17:34:58.764 INFO:teuthology.orchestra.run.vm01.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.37882.log 2026-03-24T17:34:58.765 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.72402.log: gzip -5 --verbose -- 0.0% /var/log/ceph/ceph-client.admin.81491.log 2026-03-24T17:34:58.765 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.72402.log.gz 2026-03-24T17:34:58.765 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69679.log 2026-03-24T17:34:58.765 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.37882.log: /var/log/ceph/ceph-client.admin.81491.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85953.log 2026-03-24T17:34:58.765 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.69679.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81491.log.gz 2026-03-24T17:34:58.765 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.69679.log.gz 2026-03-24T17:34:58.765 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85372.log 2026-03-24T17:34:58.766 INFO:teuthology.orchestra.run.vm01.stderr: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.37882.log.gz 2026-03-24T17:34:58.766 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.85953.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37567.log 2026-03-24T17:34:58.766 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85953.log.gz 2026-03-24T17:34:58.766 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.85372.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40742.log 2026-03-24T17:34:58.766 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85372.log.gz 2026-03-24T17:34:58.766 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.37567.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44203.log 2026-03-24T17:34:58.766 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.40742.log: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.37567.log.gz 2026-03-24T17:34:58.766 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76030.log 2026-03-24T17:34:58.766 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40742.log.gz 2026-03-24T17:34:58.767 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.44203.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50747.log 2026-03-24T17:34:58.767 INFO:teuthology.orchestra.run.vm01.stderr: 31.5% -- replaced with /var/log/ceph/ceph-client.admin.44203.log.gz 2026-03-24T17:34:58.767 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.76030.log: gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.34799.log 2026-03-24T17:34:58.767 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76030.log.gz 2026-03-24T17:34:58.767 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.50747.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84961.log 2026-03-24T17:34:58.767 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50747.log.gz 2026-03-24T17:34:58.767 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.34799.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36797.log 2026-03-24T17:34:58.767 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34799.log.gz 2026-03-24T17:34:58.768 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.84961.log: gzip -5 --verbose -- 0.0% /var/log/ceph/ceph-client.admin.29799.log 2026-03-24T17:34:58.768 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.84961.log.gz 2026-03-24T17:34:58.768 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.36797.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29842.log 2026-03-24T17:34:58.768 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36797.log.gz 2026-03-24T17:34:58.768 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.29799.log: gzip -5 --verbose -- /var/log/ceph/ceph.log 2026-03-24T17:34:58.768 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29799.log.gz 2026-03-24T17:34:58.768 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.29842.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88480.log 2026-03-24T17:34:58.768 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29842.log.gz 
2026-03-24T17:34:58.768 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40501.log 2026-03-24T17:34:58.769 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.88480.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74457.log 2026-03-24T17:34:58.769 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88480.log.gz 2026-03-24T17:34:58.769 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.40501.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72466.log 2026-03-24T17:34:58.769 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40501.log.gz 2026-03-24T17:34:58.769 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86185.log 2026-03-24T17:34:58.770 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.74457.log: /var/log/ceph/ceph-client.admin.72466.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72466.log.gz 2026-03-24T17:34:58.770 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44523.log 2026-03-24T17:34:58.770 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74457.log.gz 2026-03-24T17:34:58.770 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.86185.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38058.log 2026-03-24T17:34:58.770 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86185.log.gz 2026-03-24T17:34:58.770 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.44523.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44523.log.gz 2026-03-24T17:34:58.770 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38877.log 
2026-03-24T17:34:58.770 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89898.log 2026-03-24T17:34:58.770 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.38058.log: 88.9% -- replaced with /var/log/ceph/ceph.log.gz 2026-03-24T17:34:58.770 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.38877.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35673.log 2026-03-24T17:34:58.771 INFO:teuthology.orchestra.run.vm01.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.38877.log.gz 2026-03-24T17:34:58.771 INFO:teuthology.orchestra.run.vm01.stderr: 26.1%/var/log/ceph/ceph-client.admin.89898.log: -- replaced with /var/log/ceph/ceph-client.admin.38058.log.gzgzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83574.log 2026-03-24T17:34:58.771 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89898.log.gz 2026-03-24T17:34:58.771 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:58.771 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.35673.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70558.log 2026-03-24T17:34:58.771 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35673.log.gz 2026-03-24T17:34:58.771 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.83574.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40400.log 2026-03-24T17:34:58.771 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83574.log.gz 2026-03-24T17:34:58.771 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.70558.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40479.log 2026-03-24T17:34:58.771 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70558.log.gz 2026-03-24T17:34:58.771 
INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.40400.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79079.log 2026-03-24T17:34:58.771 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40400.log.gz 2026-03-24T17:34:58.772 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.40479.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62142.log 2026-03-24T17:34:58.772 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40479.log.gz 2026-03-24T17:34:58.772 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.79079.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45771.log 2026-03-24T17:34:58.772 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79079.log.gz 2026-03-24T17:34:58.772 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.62142.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80425.log 2026-03-24T17:34:58.772 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62142.log.gz 2026-03-24T17:34:58.772 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.45771.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63162.log 2026-03-24T17:34:58.772 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45771.log.gz 2026-03-24T17:34:58.772 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.80425.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85931.log 2026-03-24T17:34:58.773 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80425.log.gz 2026-03-24T17:34:58.773 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.63162.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53478.log 
2026-03-24T17:34:58.773 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63162.log.gz 2026-03-24T17:34:58.773 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.85931.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76950.log 2026-03-24T17:34:58.773 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85931.log.gz 2026-03-24T17:34:58.773 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.53478.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88282.log 2026-03-24T17:34:58.773 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53478.log.gz 2026-03-24T17:34:58.773 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.76950.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76950.log.gz 2026-03-24T17:34:58.773 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48327.log 2026-03-24T17:34:58.774 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.88282.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46342.log 2026-03-24T17:34:58.774 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88282.log.gz 2026-03-24T17:34:58.774 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.48327.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37903.log 2026-03-24T17:34:58.774 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48327.log.gz 2026-03-24T17:34:58.774 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.46342.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69163.log 2026-03-24T17:34:58.774 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46342.log.gz 2026-03-24T17:34:58.774 
INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.37903.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42187.log 2026-03-24T17:34:58.775 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.69163.log: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.37903.log.gz 2026-03-24T17:34:58.775 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69163.log.gz 2026-03-24T17:34:58.775 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39878.log 2026-03-24T17:34:58.775 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.42187.log: gzip -5 --verbose -- /var/log/ceph/ceph-osd.0.log 2026-03-24T17:34:58.775 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.39878.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.42187.log.gz 2026-03-24T17:34:58.775 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85910.log 2026-03-24T17:34:58.775 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39878.log.gz 2026-03-24T17:34:58.775 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-osd.0.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.23038.log 2026-03-24T17:34:58.776 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31221.log 2026-03-24T17:34:58.776 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.85910.log: /var/log/ceph/ceph-client.admin.23038.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85910.log.gz 2026-03-24T17:34:58.776 INFO:teuthology.orchestra.run.vm01.stderr: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.23038.log.gz 2026-03-24T17:34:58.778 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38541.log 2026-03-24T17:34:58.778 
INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.31221.log: gzip 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31221.log.gz -5 2026-03-24T17:34:58.778 INFO:teuthology.orchestra.run.vm01.stderr: --verbose -- /var/log/ceph/ceph-client.admin.35860.log 2026-03-24T17:34:58.778 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.38541.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81297.log 2026-03-24T17:34:58.778 INFO:teuthology.orchestra.run.vm01.stderr: 26.2% -- replaced with /var/log/ceph/ceph-client.admin.38541.log.gz 2026-03-24T17:34:58.778 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.35860.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80683.log 2026-03-24T17:34:58.778 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35860.log.gz 2026-03-24T17:34:58.779 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.81297.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86601.log 2026-03-24T17:34:58.779 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81297.log.gz 2026-03-24T17:34:58.779 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.80683.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68265.log 2026-03-24T17:34:58.779 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80683.log.gz 2026-03-24T17:34:58.779 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57005.log 2026-03-24T17:34:58.779 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.86601.log: /var/log/ceph/ceph-client.admin.68265.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70923.log 2026-03-24T17:34:58.779 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68265.log.gz 
2026-03-24T17:34:58.779 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86601.log.gz
2026-03-24T17:34:58.779 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.57005.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57005.log.gz
2026-03-24T17:34:58.780 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43024.log
2026-03-24T17:34:58.780 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74105.log
2026-03-24T17:34:58.780 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.70923.log: /var/log/ceph/ceph-client.admin.43024.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69872.log
2026-03-24T17:34:58.780 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70923.log.gz
2026-03-24T17:34:58.780 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.74105.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74105.log.gz
2026-03-24T17:34:58.780 INFO:teuthology.orchestra.run.vm01.stderr: 53.9% -- replaced with /var/log/ceph/ceph-client.admin.43024.log.gz
2026-03-24T17:34:58.780 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36098.log
2026-03-24T17:34:58.781 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.69872.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36320.log
2026-03-24T17:34:58.781 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69872.log.gz
2026-03-24T17:34:58.781 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.36098.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36098.log.gz
2026-03-24T17:34:58.781 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78450.log
2026-03-24T17:34:58.781 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55710.log
2026-03-24T17:34:58.781 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.78450.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78450.log.gz
2026-03-24T17:34:58.781 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.36320.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36320.log.gz
2026-03-24T17:34:58.781 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42086.log
2026-03-24T17:34:58.782 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.55710.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66189.log
2026-03-24T17:34:58.782 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55710.log.gz
2026-03-24T17:34:58.782 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.42086.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80937.log
2026-03-24T17:34:58.782 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42086.log.gz
2026-03-24T17:34:58.782 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59315.log
2026-03-24T17:34:58.782 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.66189.log: /var/log/ceph/ceph-client.admin.80937.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46115.log
2026-03-24T17:34:58.782 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80937.log.gz
2026-03-24T17:34:58.783 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66189.log.gz
2026-03-24T17:34:58.783 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.59315.log: 11.6% -- replaced with /var/log/ceph/ceph-client.admin.59315.log.gz
2026-03-24T17:34:58.783 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70150.log
2026-03-24T17:34:58.783 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68824.log
2026-03-24T17:34:58.783 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.46115.log: /var/log/ceph/ceph-client.admin.70150.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54548.log
2026-03-24T17:34:58.783 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70150.log.gz
2026-03-24T17:34:58.783 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46115.log.gz
2026-03-24T17:34:58.783 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.68824.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68824.log.gz
2026-03-24T17:34:58.784 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88910.log
2026-03-24T17:34:58.784 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81276.log
2026-03-24T17:34:58.784 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.54548.log: /var/log/ceph/ceph-client.admin.88910.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86558.log
2026-03-24T17:34:58.784 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88910.log.gz
2026-03-24T17:34:58.784 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54548.log.gz
2026-03-24T17:34:58.784 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.81276.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81276.log.gz
2026-03-24T17:34:58.784 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36984.log
2026-03-24T17:34:58.785 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71907.log
2026-03-24T17:34:58.785 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.86558.log: /var/log/ceph/ceph-client.admin.36984.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90505.log
2026-03-24T17:34:58.785 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36984.log.gz
2026-03-24T17:34:58.785 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86558.log.gz
2026-03-24T17:34:58.785 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.71907.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71907.log.gz
2026-03-24T17:34:58.785 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45253.log
2026-03-24T17:34:58.786 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27804.log
2026-03-24T17:34:58.786 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.90505.log: /var/log/ceph/ceph-client.admin.45253.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79115.log
2026-03-24T17:34:58.786 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45253.log.gz
2026-03-24T17:34:58.786 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90505.log.gz
2026-03-24T17:34:58.786 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.27804.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27804.log.gz
2026-03-24T17:34:58.786 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27255.log
2026-03-24T17:34:58.786 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35843.log
2026-03-24T17:34:58.787 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.79115.log: /var/log/ceph/ceph-client.admin.27255.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66516.log
2026-03-24T17:34:58.787 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27255.log.gz
2026-03-24T17:34:58.787 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79115.log.gz
2026-03-24T17:34:58.787 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.35843.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35843.log.gz
2026-03-24T17:34:58.787 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81319.log
2026-03-24T17:34:58.787 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90040.log
2026-03-24T17:34:58.787 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.66516.log: /var/log/ceph/ceph-client.admin.81319.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62564.log
2026-03-24T17:34:58.787 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81319.log.gz
2026-03-24T17:34:58.787 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66516.log.gz
2026-03-24T17:34:58.788 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.90040.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90040.log.gz
2026-03-24T17:34:58.788 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82028.log
2026-03-24T17:34:58.788 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78236.log
2026-03-24T17:34:58.788 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.62564.log: /var/log/ceph/ceph-client.admin.82028.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89168.log
2026-03-24T17:34:58.788 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82028.log.gz
2026-03-24T17:34:58.788 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62564.log.gz
2026-03-24T17:34:58.788 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.78236.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78236.log.gz
2026-03-24T17:34:58.789 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35809.log
2026-03-24T17:34:58.789 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82902.log
2026-03-24T17:34:58.789 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.89168.log: /var/log/ceph/ceph-client.admin.35809.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80361.log
2026-03-24T17:34:58.789 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35809.log.gz
2026-03-24T17:34:58.789 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89168.log.gz
2026-03-24T17:34:58.789 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.82902.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82902.log.gz
2026-03-24T17:34:58.789 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49233.log
2026-03-24T17:34:58.790 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69143.log
2026-03-24T17:34:58.790 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.80361.log: /var/log/ceph/ceph-client.admin.49233.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45857.log
2026-03-24T17:34:58.790 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49233.log.gz
2026-03-24T17:34:58.790 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80361.log.gz
2026-03-24T17:34:58.790 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.69143.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69143.log.gz
2026-03-24T17:34:58.790 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76971.log
2026-03-24T17:34:58.790 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57848.log
2026-03-24T17:34:58.791 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.45857.log: /var/log/ceph/ceph-client.admin.76971.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84531.log
2026-03-24T17:34:58.791 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76971.log.gz
2026-03-24T17:34:58.791 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45857.log.gz
2026-03-24T17:34:58.791 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.57848.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63355.log
2026-03-24T17:34:58.791 INFO:teuthology.orchestra.run.vm01.stderr: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.57848.log.gz
2026-03-24T17:34:58.791 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65024.log
2026-03-24T17:34:58.791 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.84531.log: /var/log/ceph/ceph-client.admin.63355.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59039.log
2026-03-24T17:34:58.792 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63355.log.gz
2026-03-24T17:34:58.792 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.84531.log.gz
2026-03-24T17:34:58.792 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.65024.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65024.log.gz
2026-03-24T17:34:58.792 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55366.log
2026-03-24T17:34:58.792 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.59039.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50387.log
2026-03-24T17:34:58.792 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59039.log.gz
2026-03-24T17:34:58.792 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.55366.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46430.log
2026-03-24T17:34:58.792 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55366.log.gz
2026-03-24T17:34:58.792 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48890.log
2026-03-24T17:34:58.793 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.50387.log: /var/log/ceph/ceph-client.admin.46430.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85176.log
2026-03-24T17:34:58.793 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46430.log.gz
2026-03-24T17:34:58.793 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50387.log.gz
2026-03-24T17:34:58.793 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.48890.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48890.log.gz
2026-03-24T17:34:58.793 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59721.log
2026-03-24T17:34:58.793 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49018.log
2026-03-24T17:34:58.794 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.85176.log: /var/log/ceph/ceph-client.admin.59721.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26180.log
2026-03-24T17:34:58.794 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59721.log.gz
2026-03-24T17:34:58.794 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85176.log.gz
2026-03-24T17:34:58.794 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.49018.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49018.log.gz
2026-03-24T17:34:58.794 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71688.log
2026-03-24T17:34:58.794 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67577.log
2026-03-24T17:34:58.794 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.26180.log: /var/log/ceph/ceph-client.admin.71688.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82730.log
2026-03-24T17:34:58.794 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71688.log.gz
2026-03-24T17:34:58.794 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26180.log.gz
2026-03-24T17:34:58.794 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.67577.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67577.log.gz
2026-03-24T17:34:58.795 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72922.log
2026-03-24T17:34:58.795 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73226.log
2026-03-24T17:34:58.795 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.82730.log: /var/log/ceph/ceph-client.admin.72922.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35638.log
2026-03-24T17:34:58.795 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72922.log.gz
2026-03-24T17:34:58.795 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.82730.log.gz
2026-03-24T17:34:58.795 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.73226.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73226.log.gz
2026-03-24T17:34:58.795 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32348.log
2026-03-24T17:34:58.795 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.35638.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75492.log
2026-03-24T17:34:58.796 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35638.log.gz
2026-03-24T17:34:58.796 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.32348.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72316.log
2026-03-24T17:34:58.796 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32348.log.gz
2026-03-24T17:34:58.796 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57155.log
2026-03-24T17:34:58.796 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.75492.log: /var/log/ceph/ceph-client.admin.72316.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62256.log
2026-03-24T17:34:58.796 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72316.log.gz
2026-03-24T17:34:58.796 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75492.log.gz
2026-03-24T17:34:58.796 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.57155.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57155.log.gz
2026-03-24T17:34:58.797 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82257.log
2026-03-24T17:34:58.797 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32246.log
2026-03-24T17:34:58.797 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.62256.log: /var/log/ceph/ceph-client.admin.82257.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55388.log
2026-03-24T17:34:58.797 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62256.log.gz
2026-03-24T17:34:58.797 INFO:teuthology.orchestra.run.vm01.stderr: 84.2% -- replaced with /var/log/ceph/ceph-client.admin.82257.log.gz
2026-03-24T17:34:58.797 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.32246.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40848.log
2026-03-24T17:34:58.797 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32246.log.gz
2026-03-24T17:34:58.798 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83096.log
2026-03-24T17:34:58.798 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.55388.log: /var/log/ceph/ceph-client.admin.40848.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78257.log
2026-03-24T17:34:58.798 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40848.log.gz
2026-03-24T17:34:58.798 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55388.log.gz
2026-03-24T17:34:58.798 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.83096.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83096.log.gz
2026-03-24T17:34:58.798 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77014.log
2026-03-24T17:34:58.798 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65952.log
2026-03-24T17:34:58.799 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.78257.log: /var/log/ceph/ceph-client.admin.77014.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69104.log
2026-03-24T17:34:58.799 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77014.log.gz
2026-03-24T17:34:58.799 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78257.log.gz
2026-03-24T17:34:58.799 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.65952.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65952.log.gz
2026-03-24T17:34:58.799 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72079.log
2026-03-24T17:34:58.799 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46769.log
2026-03-24T17:34:58.799 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.69104.log: /var/log/ceph/ceph-client.admin.72079.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70279.log
2026-03-24T17:34:58.799 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72079.log.gz
2026-03-24T17:34:58.800 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69104.log.gz
2026-03-24T17:34:58.800 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.46769.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46769.log.gz
2026-03-24T17:34:58.800 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71110.log
2026-03-24T17:34:58.800 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44952.log
2026-03-24T17:34:58.800 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.70279.log: /var/log/ceph/ceph-client.admin.71110.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81706.log
2026-03-24T17:34:58.800 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71110.log.gz
2026-03-24T17:34:58.800 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70279.log.gz
2026-03-24T17:34:58.800 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.44952.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44952.log.gz
2026-03-24T17:34:58.801 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89748.log
2026-03-24T17:34:58.801 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34597.log
2026-03-24T17:34:58.801 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.81706.log: /var/log/ceph/ceph-client.admin.89748.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82169.log
2026-03-24T17:34:58.801 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89748.log.gz
2026-03-24T17:34:58.801 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81706.log.gz
2026-03-24T17:34:58.801 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.34597.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34597.log.gz
2026-03-24T17:34:58.801 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85974.log
2026-03-24T17:34:58.802 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64046.log
2026-03-24T17:34:58.802 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.82169.log: /var/log/ceph/ceph-client.admin.85974.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33537.log
2026-03-24T17:34:58.802 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85974.log.gz
2026-03-24T17:34:58.802 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82169.log.gz
2026-03-24T17:34:58.802 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.64046.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64046.log.gz
2026-03-24T17:34:58.802 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67061.log
2026-03-24T17:34:58.802 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41617.log
2026-03-24T17:34:58.803 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.33537.log: /var/log/ceph/ceph-client.admin.67061.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41364.log
2026-03-24T17:34:58.803 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67061.log.gz
2026-03-24T17:34:58.803 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33537.log.gz
2026-03-24T17:34:58.803 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.41617.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41617.log.gz
2026-03-24T17:34:58.803 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26886.log
2026-03-24T17:34:58.803 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73044.log
2026-03-24T17:34:58.804 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.41364.log: /var/log/ceph/ceph-client.admin.26886.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33251.log
2026-03-24T17:34:58.804 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26886.log.gz
2026-03-24T17:34:58.804 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41364.log.gz
2026-03-24T17:34:58.804 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.73044.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73044.log.gz
2026-03-24T17:34:58.804 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26223.log
2026-03-24T17:34:58.804 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53139.log
2026-03-24T17:34:58.805 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.53139.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53139.log.gz
2026-03-24T17:34:58.805 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85308.log
2026-03-24T17:34:58.805 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.26223.log: /var/log/ceph/ceph-client.admin.33251.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26223.log.gz
2026-03-24T17:34:58.805 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77863.log
2026-03-24T17:34:58.805 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.33251.log.gz
2026-03-24T17:34:58.805 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.85308.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62606.log
2026-03-24T17:34:58.805 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85308.log.gz
2026-03-24T17:34:58.805 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.77863.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77863.log.gz
2026-03-24T17:34:58.806 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80339.log
2026-03-24T17:34:58.806 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77208.log
2026-03-24T17:34:58.806 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.62606.log: /var/log/ceph/ceph-client.admin.80339.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82203.log
2026-03-24T17:34:58.806 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80339.log.gz
2026-03-24T17:34:58.806 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62606.log.gz
2026-03-24T17:34:58.806 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.77208.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77208.log.gz
2026-03-24T17:34:58.806 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82773.log
2026-03-24T17:34:58.807 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66038.log
2026-03-24T17:34:58.807 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.82203.log: /var/log/ceph/ceph-client.admin.82773.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61209.log
2026-03-24T17:34:58.807 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82773.log.gz
2026-03-24T17:34:58.807 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.82203.log.gz
2026-03-24T17:34:58.807 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.66038.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66038.log.gz
2026-03-24T17:34:58.807 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75643.log
2026-03-24T17:34:58.807 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.61209.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81813.log
2026-03-24T17:34:58.807 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61209.log.gz
2026-03-24T17:34:58.808 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.75643.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35359.log
2026-03-24T17:34:58.808 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75643.log.gz
2026-03-24T17:34:58.808 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88802.log
2026-03-24T17:34:58.808 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.81813.log: /var/log/ceph/ceph-client.admin.35359.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27868.log
2026-03-24T17:34:58.808 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35359.log.gz
2026-03-24T17:34:58.808 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81813.log.gz
2026-03-24T17:34:58.808 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.88802.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88802.log.gz
2026-03-24T17:34:58.808 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44144.log
2026-03-24T17:34:58.809 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48034.log
2026-03-24T17:34:58.809 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.27868.log: /var/log/ceph/ceph-client.admin.44144.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26502.log
2026-03-24T17:34:58.809 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44144.log.gz
2026-03-24T17:34:58.809 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27868.log.gz
2026-03-24T17:34:58.809 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.48034.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48034.log.gz
2026-03-24T17:34:58.809 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61621.log
2026-03-24T17:34:58.809 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81210.log
2026-03-24T17:34:58.810 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.26502.log: /var/log/ceph/ceph-client.admin.61621.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42816.log
2026-03-24T17:34:58.810 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61621.log.gz
2026-03-24T17:34:58.810 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26502.log.gz
2026-03-24T17:34:58.810 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.81210.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81210.log.gz
2026-03-24T17:34:58.810 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41783.log
2026-03-24T17:34:58.810 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60456.log
2026-03-24T17:34:58.810 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.42816.log: /var/log/ceph/ceph-client.admin.41783.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52559.log
2026-03-24T17:34:58.811 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41783.log.gz
2026-03-24T17:34:58.811 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42816.log.gz
2026-03-24T17:34:58.811 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.60456.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60456.log.gz
2026-03-24T17:34:58.811 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86450.log
2026-03-24T17:34:58.811 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83359.log
2026-03-24T17:34:58.811 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.52559.log: /var/log/ceph/ceph-client.admin.86450.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74780.log
2026-03-24T17:34:58.811 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86450.log.gz
2026-03-24T17:34:58.811 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52559.log.gz
2026-03-24T17:34:58.811 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.83359.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83359.log.gz
2026-03-24T17:34:58.812 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58384.log
2026-03-24T17:34:58.812 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65184.log
2026-03-24T17:34:58.812 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.74780.log: /var/log/ceph/ceph-client.admin.58384.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72445.log
2026-03-24T17:34:58.812 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58384.log.gz
2026-03-24T17:34:58.812 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74780.log.gz
2026-03-24T17:34:58.812 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.65184.log: 55.3% -- replaced with /var/log/ceph/ceph-client.admin.65184.log.gz
2026-03-24T17:34:58.812 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30744.log
2026-03-24T17:34:58.813 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54177.log
2026-03-24T17:34:58.813 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.72445.log: /var/log/ceph/ceph-client.admin.30744.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84918.log
2026-03-24T17:34:58.813 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30744.log.gz
2026-03-24T17:34:58.813 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72445.log.gz
2026-03-24T17:34:58.813 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.54177.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54177.log.gz
2026-03-24T17:34:58.813 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77809.log
2026-03-24T17:34:58.814 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48138.log
2026-03-24T17:34:58.814 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.84918.log: /var/log/ceph/ceph-client.admin.77809.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78686.log
2026-03-24T17:34:58.814 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77809.log.gz
2026-03-24T17:34:58.814 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84918.log.gz
2026-03-24T17:34:58.814 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.48138.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48138.log.gz
2026-03-24T17:34:58.814 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56280.log
2026-03-24T17:34:58.814 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28711.log
2026-03-24T17:34:58.815 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.78686.log: /var/log/ceph/ceph-client.admin.56280.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79906.log
2026-03-24T17:34:58.815 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56280.log.gz
2026-03-24T17:34:58.815 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78686.log.gz
2026-03-24T17:34:58.815 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.28711.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28711.log.gz
2026-03-24T17:34:58.815 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38079.log
2026-03-24T17:34:58.815 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86407.log
2026-03-24T17:34:58.815 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.79906.log: /var/log/ceph/ceph-client.admin.38079.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41701.log
2026-03-24T17:34:58.816 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79906.log.gz
2026-03-24T17:34:58.816 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.86407.log: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.38079.log.gz
2026-03-24T17:34:58.816 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86407.log.gz
2026-03-24T17:34:58.816 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26545.log
2026-03-24T17:34:58.816 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.41701.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.51177.log 2026-03-24T17:34:58.816 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.26545.log: 18.5% -- replaced with /var/log/ceph/ceph-client.admin.41701.log.gz 2026-03-24T17:34:58.816 INFO:teuthology.orchestra.run.vm01.stderr:gzip 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26545.log.gz -5 2026-03-24T17:34:58.816 INFO:teuthology.orchestra.run.vm01.stderr: --verbose -- /var/log/ceph/ceph-client.admin.85047.log 2026-03-24T17:34:58.816 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69765.log 2026-03-24T17:34:58.817 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.51177.log: /var/log/ceph/ceph-client.admin.85047.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87239.log 2026-03-24T17:34:58.817 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85047.log.gz 2026-03-24T17:34:58.817 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51177.log.gz 2026-03-24T17:34:58.817 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.69765.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69765.log.gz 2026-03-24T17:34:58.817 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59510.log 2026-03-24T17:34:58.817 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69808.log 2026-03-24T17:34:58.817 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.87239.log: /var/log/ceph/ceph-client.admin.59510.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52498.log 2026-03-24T17:34:58.817 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59510.log.gz 2026-03-24T17:34:58.818 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.87239.log.gz 2026-03-24T17:34:58.818 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.69808.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69808.log.gz 2026-03-24T17:34:58.818 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65522.log 2026-03-24T17:34:58.818 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56467.log 2026-03-24T17:34:58.818 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.52498.log: /var/log/ceph/ceph-client.admin.65522.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.65522.log.gz 2026-03-24T17:34:58.818 INFO:teuthology.orchestra.run.vm01.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.35690.log 2026-03-24T17:34:58.818 INFO:teuthology.orchestra.run.vm01.stderr: 58.8% -- replaced with /var/log/ceph/ceph-client.admin.52498.log.gz 2026-03-24T17:34:58.819 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.56467.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55581.log 2026-03-24T17:34:58.819 INFO:teuthology.orchestra.run.vm01.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.56467.log.gz 2026-03-24T17:34:58.819 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47027.log 2026-03-24T17:34:58.819 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.35690.log: /var/log/ceph/ceph-client.admin.55581.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55581.log.gz 2026-03-24T17:34:58.819 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37018.log 2026-03-24T17:34:58.819 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35690.log.gz 2026-03-24T17:34:58.819 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.47027.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.47027.log.gz 2026-03-24T17:34:58.820 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81577.log 2026-03-24T17:34:58.820 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39213.log 2026-03-24T17:34:58.820 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.37018.log: /var/log/ceph/ceph-client.admin.81577.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81577.log.gz 2026-03-24T17:34:58.820 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62646.log 2026-03-24T17:34:58.820 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.37018.log.gz 2026-03-24T17:34:58.820 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.39213.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50448.log 2026-03-24T17:34:58.820 INFO:teuthology.orchestra.run.vm01.stderr: 26.3% -- replaced with /var/log/ceph/ceph-client.admin.39213.log.gz 2026-03-24T17:34:58.821 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.62646.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67405.log 2026-03-24T17:34:58.821 INFO:teuthology.orchestra.run.vm01.stderr: 54.4% -- replaced with /var/log/ceph/ceph-client.admin.62646.log.gz 2026-03-24T17:34:58.821 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.50448.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41278.log 2026-03-24T17:34:58.821 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50448.log.gz 2026-03-24T17:34:58.822 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.67405.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67405.log.gz 2026-03-24T17:34:58.822 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.31820.log 2026-03-24T17:34:58.822 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36627.log 2026-03-24T17:34:58.822 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.41278.log: /var/log/ceph/ceph-client.admin.31820.log: 0.0%gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84371.log 2026-03-24T17:34:58.822 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.41278.log.gz 2026-03-24T17:34:58.822 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.31820.log.gz 2026-03-24T17:34:58.822 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.36627.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36627.log.gz 2026-03-24T17:34:58.823 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41472.log 2026-03-24T17:34:58.823 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.84371.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55452.log 2026-03-24T17:34:58.823 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84371.log.gz 2026-03-24T17:34:58.823 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45124.log 2026-03-24T17:34:58.823 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.41472.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41472.log.gz 2026-03-24T17:34:58.823 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78563.log 2026-03-24T17:34:58.824 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.55452.log: /var/log/ceph/ceph-client.admin.45124.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45124.log.gz 2026-03-24T17:34:58.824 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.48550.log 2026-03-24T17:34:58.824 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55452.log.gz 2026-03-24T17:34:58.824 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.78563.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78563.log.gz 2026-03-24T17:34:58.824 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43703.log 2026-03-24T17:34:58.824 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51091.log 2026-03-24T17:34:58.824 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.48550.log: /var/log/ceph/ceph-client.admin.43703.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72617.log 2026-03-24T17:34:58.824 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48550.log.gz 2026-03-24T17:34:58.824 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.43703.log.gz 2026-03-24T17:34:58.825 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.51091.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51091.log.gz 2026-03-24T17:34:58.825 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37693.log 2026-03-24T17:34:58.825 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.72617.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66390.log 2026-03-24T17:34:58.825 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72617.log.gz 2026-03-24T17:34:58.825 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46748.log 2026-03-24T17:34:58.825 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.37693.log: gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.37189.log 2026-03-24T17:34:58.826 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.66390.log: 0.0%/var/log/ceph/ceph-client.admin.46748.log: -- replaced with /var/log/ceph/ceph-client.admin.66390.log.gz 2026-03-24T17:34:58.826 INFO:teuthology.orchestra.run.vm01.stderr: 25.5% -- replaced with /var/log/ceph/ceph-client.admin.37693.log.gz 2026-03-24T17:34:58.826 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46748.log.gz 2026-03-24T17:34:58.826 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72853.log 2026-03-24T17:34:58.826 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86337.log 2026-03-24T17:34:58.826 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.37189.log: /var/log/ceph/ceph-client.admin.72853.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77456.log 2026-03-24T17:34:58.826 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72853.log.gz 2026-03-24T17:34:58.826 INFO:teuthology.orchestra.run.vm01.stderr: 26.2% -- replaced with /var/log/ceph/ceph-client.admin.37189.log.gz 2026-03-24T17:34:58.827 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.86337.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86337.log.gz 2026-03-24T17:34:58.827 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49104.log 2026-03-24T17:34:58.827 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25961.log 2026-03-24T17:34:58.827 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.77456.log: /var/log/ceph/ceph-client.admin.49104.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49104.log.gz 2026-03-24T17:34:58.827 INFO:teuthology.orchestra.run.vm01.stderr:gzip 
-5 --verbose -- /var/log/ceph/ceph-client.admin.35468.log 2026-03-24T17:34:58.827 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.77456.log.gz 2026-03-24T17:34:58.828 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.25961.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25961.log.gz 2026-03-24T17:34:58.828 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56300.log 2026-03-24T17:34:58.828 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74051.log 2026-03-24T17:34:58.828 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.35468.log: 0.0%/var/log/ceph/ceph-client.admin.56300.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74737.log 2026-03-24T17:34:58.828 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.35468.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56300.log.gz 2026-03-24T17:34:58.828 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:58.828 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.74051.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74051.log.gz 2026-03-24T17:34:58.828 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60736.log 2026-03-24T17:34:58.829 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66698.log 2026-03-24T17:34:58.829 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.74737.log: /var/log/ceph/ceph-client.admin.60736.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28493.log 2026-03-24T17:34:58.829 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60736.log.gz 2026-03-24T17:34:58.829 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.74737.log.gz 2026-03-24T17:34:58.829 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.66698.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66698.log.gz 2026-03-24T17:34:58.829 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78488.log 2026-03-24T17:34:58.829 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63198.log 2026-03-24T17:34:58.830 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.28493.log: /var/log/ceph/ceph-client.admin.78488.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78488.log.gz 2026-03-24T17:34:58.830 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54052.log 2026-03-24T17:34:58.830 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.28493.log.gz 2026-03-24T17:34:58.830 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.63198.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63198.log.gz 2026-03-24T17:34:58.830 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45685.log 2026-03-24T17:34:58.830 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26266.log 2026-03-24T17:34:58.831 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.54052.log: /var/log/ceph/ceph-client.admin.45685.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45685.log.gz 2026-03-24T17:34:58.831 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63832.log 2026-03-24T17:34:58.831 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.54052.log.gz 2026-03-24T17:34:58.831 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.26266.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.26266.log.gz 2026-03-24T17:34:58.831 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63873.log 2026-03-24T17:34:58.831 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50366.log 2026-03-24T17:34:58.831 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.63832.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63832.log.gz 2026-03-24T17:34:58.831 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.63873.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28235.log 2026-03-24T17:34:58.831 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63873.log.gz 2026-03-24T17:34:58.832 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.50366.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50366.log.gz 2026-03-24T17:34:58.832 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29069.log 2026-03-24T17:34:58.832 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65341.log 2026-03-24T17:34:58.832 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.28235.log: /var/log/ceph/ceph-client.admin.29069.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29069.log.gz 2026-03-24T17:34:58.832 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68846.log 2026-03-24T17:34:58.832 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.28235.log.gz 2026-03-24T17:34:58.832 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.65341.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44565.log 2026-03-24T17:34:58.832 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.65341.log.gz 2026-03-24T17:34:58.833 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.68846.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67663.log 2026-03-24T17:34:58.833 INFO:teuthology.orchestra.run.vm01.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.68846.log.gz 2026-03-24T17:34:58.833 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61187.log 2026-03-24T17:34:58.833 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.44565.log: /var/log/ceph/ceph-client.admin.67663.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67663.log.gz 2026-03-24T17:34:58.833 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89791.log 2026-03-24T17:34:58.833 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.44565.log.gz 2026-03-24T17:34:58.833 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.61187.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61187.log.gz 2026-03-24T17:34:58.834 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31350.log 2026-03-24T17:34:58.834 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71929.log 2026-03-24T17:34:58.834 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.89791.log: 0.0%gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85265.log 2026-03-24T17:34:58.834 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.31350.log: -- replaced with /var/log/ceph/ceph-client.admin.89791.log.gz 2026-03-24T17:34:58.834 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31350.log.gz 2026-03-24T17:34:58.834 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.71929.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.71929.log.gz 2026-03-24T17:34:58.834 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48804.log 2026-03-24T17:34:58.835 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59354.log 2026-03-24T17:34:58.835 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.85265.log: /var/log/ceph/ceph-client.admin.48804.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67082.log 2026-03-24T17:34:58.835 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48804.log.gz 2026-03-24T17:34:58.835 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85265.log.gz 2026-03-24T17:34:58.835 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.59354.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59354.log.gz 2026-03-24T17:34:58.835 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30531.log 2026-03-24T17:34:58.835 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89856.log 2026-03-24T17:34:58.836 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.67082.log: /var/log/ceph/ceph-client.admin.30531.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87052.log 2026-03-24T17:34:58.836 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30531.log.gz 2026-03-24T17:34:58.836 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67082.log.gz 2026-03-24T17:34:58.836 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.89856.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89856.log.gz 2026-03-24T17:34:58.836 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.89834.log 2026-03-24T17:34:58.836 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72719.log 2026-03-24T17:34:58.837 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.87052.log: /var/log/ceph/ceph-client.admin.89834.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66369.log 2026-03-24T17:34:58.837 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87052.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89834.log.gz 2026-03-24T17:34:58.837 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:58.837 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.72719.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72719.log.gz 2026-03-24T17:34:58.837 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46941.log 2026-03-24T17:34:58.837 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.66369.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57745.log 2026-03-24T17:34:58.837 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66369.log.gz 2026-03-24T17:34:58.837 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.46941.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50428.log 2026-03-24T17:34:58.837 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46941.log.gz 2026-03-24T17:34:58.837 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56596.log 2026-03-24T17:34:58.838 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.57745.log: /var/log/ceph/ceph-client.admin.50428.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33397.log 2026-03-24T17:34:58.838 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.50428.log.gz 2026-03-24T17:34:58.838 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57745.log.gz 2026-03-24T17:34:58.838 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.56596.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56596.log.gz 2026-03-24T17:34:58.838 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50554.log 2026-03-24T17:34:58.838 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44082.log 2026-03-24T17:34:58.839 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.33397.log: /var/log/ceph/ceph-client.admin.50554.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50554.log.gz 2026-03-24T17:34:58.839 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44651.log 2026-03-24T17:34:58.839 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33397.log.gz 2026-03-24T17:34:58.839 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.44082.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44082.log.gz 2026-03-24T17:34:58.839 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57459.log 2026-03-24T17:34:58.839 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86124.log 2026-03-24T17:34:58.839 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.44651.log: /var/log/ceph/ceph-client.admin.57459.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67233.log 2026-03-24T17:34:58.839 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57459.log.gz 2026-03-24T17:34:58.839 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.44651.log.gz
[condensed: between 2026-03-24T17:34:58.839 and 2026-03-24T17:34:58.883, teuthology.orchestra.run.vm01.stderr recorded the output of parallel `gzip -5 --verbose` jobs compressing the remaining /var/log/ceph/ceph-client.admin.<pid>.log files (and /var/log/ceph/ceph-osd.1.log). Each file was replaced with its .gz counterpart; reported compression ratios ranged from 0.0% for the mostly empty client logs up to about 26.5% for the larger ones. The per-file messages appear interleaved in the raw log because the concurrent gzip processes wrote to a shared stderr stream.]
INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42306.log 2026-03-24T17:34:58.883 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75017.log 2026-03-24T17:34:58.883 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.30984.log: /var/log/ceph/ceph-client.admin.42306.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30984.log.gz 2026-03-24T17:34:58.883 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42306.log.gz 2026-03-24T17:34:58.883 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61792.log 2026-03-24T17:34:58.884 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31786.log 2026-03-24T17:34:58.884 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.75017.log: /var/log/ceph/ceph-client.admin.61792.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75017.log.gz 2026-03-24T17:34:58.884 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61792.log.gz 2026-03-24T17:34:58.884 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35417.log 2026-03-24T17:34:58.884 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80730.log 2026-03-24T17:34:58.884 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.31786.log: /var/log/ceph/ceph-client.admin.35417.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35417.log.gz 2026-03-24T17:34:58.885 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.31786.log.gz 2026-03-24T17:34:58.885 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88260.log 2026-03-24T17:34:58.885 
INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32433.log 2026-03-24T17:34:58.885 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.80730.log: /var/log/ceph/ceph-client.admin.88260.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80730.log.gz 2026-03-24T17:34:58.885 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88260.log.gz 2026-03-24T17:34:58.885 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40522.log 2026-03-24T17:34:58.886 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53927.log 2026-03-24T17:34:58.886 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.32433.log: /var/log/ceph/ceph-client.admin.40522.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40522.log.gz 2026-03-24T17:34:58.886 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32433.log.gz 2026-03-24T17:34:58.886 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50618.log 2026-03-24T17:34:58.886 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29928.log 2026-03-24T17:34:58.887 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.53927.log: /var/log/ceph/ceph-client.admin.50618.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53927.log.gz 2026-03-24T17:34:58.887 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50618.log.gz 2026-03-24T17:34:58.887 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56112.log 2026-03-24T17:34:58.887 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42427.log 2026-03-24T17:34:58.887 
INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.29928.log: /var/log/ceph/ceph-client.admin.56112.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29928.log.gz 2026-03-24T17:34:58.887 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56112.log.gz 2026-03-24T17:34:58.888 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31414.log 2026-03-24T17:34:58.888 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57787.log 2026-03-24T17:34:58.888 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.42427.log: /var/log/ceph/ceph-client.admin.31414.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31414.log.gz 2026-03-24T17:34:58.888 INFO:teuthology.orchestra.run.vm01.stderr: 57.7% -- replaced with /var/log/ceph/ceph-client.admin.42427.log.gz 2026-03-24T17:34:58.888 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38226.log 2026-03-24T17:34:58.889 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61735.log 2026-03-24T17:34:58.889 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.57787.log: /var/log/ceph/ceph-client.admin.38226.log: 25.7% -- replaced with /var/log/ceph/ceph-client.admin.57787.log.gz 2026-03-24T17:34:58.889 INFO:teuthology.orchestra.run.vm01.stderr: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.38226.log.gz 2026-03-24T17:34:58.889 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38352.log 2026-03-24T17:34:58.890 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.61735.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61735.log.gz 2026-03-24T17:34:58.890 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78668.log 
2026-03-24T17:34:58.890 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40109.log 2026-03-24T17:34:58.890 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.38352.log: /var/log/ceph/ceph-client.admin.78668.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78668.log.gz 2026-03-24T17:34:58.891 INFO:teuthology.orchestra.run.vm01.stderr: 25.7% -- replaced with /var/log/ceph/ceph-client.admin.38352.log.gz 2026-03-24T17:34:58.891 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64517.log 2026-03-24T17:34:58.891 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63910.log 2026-03-24T17:34:58.891 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.40109.log: /var/log/ceph/ceph-client.admin.64517.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40109.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64517.log.gz 2026-03-24T17:34:58.891 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:58.891 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29369.log 2026-03-24T17:34:58.892 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86687.log 2026-03-24T17:34:58.892 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.63910.log: /var/log/ceph/ceph-client.admin.29369.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63910.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29369.log.gz 2026-03-24T17:34:58.892 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:58.892 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45513.log 2026-03-24T17:34:58.892 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57091.log 2026-03-24T17:34:58.893 
INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.86687.log: /var/log/ceph/ceph-client.admin.45513.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86687.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45513.log.gz 2026-03-24T17:34:58.893 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:58.893 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48182.log 2026-03-24T17:34:58.893 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34037.log 2026-03-24T17:34:58.893 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.57091.log: /var/log/ceph/ceph-client.admin.48182.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57091.log.gz 2026-03-24T17:34:58.893 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48182.log.gz 2026-03-24T17:34:58.894 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79520.log 2026-03-24T17:34:58.894 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62841.log 2026-03-24T17:34:58.894 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.34037.log: /var/log/ceph/ceph-client.admin.79520.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34037.log.gz 2026-03-24T17:34:58.894 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79520.log.gz 2026-03-24T17:34:58.894 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77705.log 2026-03-24T17:34:58.895 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34939.log 2026-03-24T17:34:58.895 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.62841.log: /var/log/ceph/ceph-client.admin.77705.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.62841.log.gz 2026-03-24T17:34:58.895 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77705.log.gz 2026-03-24T17:34:58.895 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37441.log 2026-03-24T17:34:58.895 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73574.log 2026-03-24T17:34:58.895 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.34939.log: /var/log/ceph/ceph-client.admin.37441.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34939.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37441.log.gz 2026-03-24T17:34:58.896 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:58.896 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66434.log 2026-03-24T17:34:58.896 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67276.log 2026-03-24T17:34:58.896 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.73574.log: /var/log/ceph/ceph-client.admin.66434.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73574.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66434.log.gz 2026-03-24T17:34:58.896 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:58.896 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35239.log 2026-03-24T17:34:58.897 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86319.log 2026-03-24T17:34:58.897 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.67276.log: /var/log/ceph/ceph-client.admin.35239.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67276.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35239.log.gz 2026-03-24T17:34:58.897 
INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:58.897 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64278.log 2026-03-24T17:34:58.898 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53182.log 2026-03-24T17:34:58.898 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.86319.log: /var/log/ceph/ceph-client.admin.64278.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86319.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64278.log.gz 2026-03-24T17:34:58.898 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:58.898 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46202.log 2026-03-24T17:34:58.898 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49061.log 2026-03-24T17:34:58.899 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.53182.log: /var/log/ceph/ceph-client.admin.46202.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53182.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46202.log.gz 2026-03-24T17:34:58.899 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:58.899 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49772.log 2026-03-24T17:34:58.899 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55431.log 2026-03-24T17:34:58.899 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.49061.log: /var/log/ceph/ceph-client.admin.49772.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49772.log.gz 2026-03-24T17:34:58.899 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.49061.log.gz 2026-03-24T17:34:58.899 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.78507.log 2026-03-24T17:34:58.900 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37252.log 2026-03-24T17:34:58.900 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.55431.log: /var/log/ceph/ceph-client.admin.78507.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78507.log.gz 2026-03-24T17:34:58.900 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.55431.log.gz 2026-03-24T17:34:58.900 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49449.log 2026-03-24T17:34:58.900 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45360.log 2026-03-24T17:34:58.901 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.37252.log: /var/log/ceph/ceph-client.admin.49449.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37252.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49449.log.gz 2026-03-24T17:34:58.901 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:58.901 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60091.log 2026-03-24T17:34:58.901 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61467.log 2026-03-24T17:34:58.901 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.45360.log: /var/log/ceph/ceph-client.admin.60091.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60091.log.gz 2026-03-24T17:34:58.901 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45360.log.gz 2026-03-24T17:34:58.902 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29756.log 2026-03-24T17:34:58.902 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.56175.log 2026-03-24T17:34:58.902 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.61467.log: /var/log/ceph/ceph-client.admin.29756.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29756.log.gz 2026-03-24T17:34:58.902 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.61467.log.gz 2026-03-24T17:34:58.902 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59698.log 2026-03-24T17:34:58.903 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46236.log 2026-03-24T17:34:58.903 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.56175.log: /var/log/ceph/ceph-client.admin.59698.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56175.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59698.log.gz 2026-03-24T17:34:58.903 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:58.903 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71295.log 2026-03-24T17:34:58.903 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71252.log 2026-03-24T17:34:58.903 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.46236.log: /var/log/ceph/ceph-client.admin.71295.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46236.log.gz 2026-03-24T17:34:58.904 INFO:teuthology.orchestra.run.vm01.stderr: 56.0% -- replaced with /var/log/ceph/ceph-client.admin.71295.log.gz 2026-03-24T17:34:58.904 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63731.log 2026-03-24T17:34:58.904 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32092.log 2026-03-24T17:34:58.904 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.71252.log: 
/var/log/ceph/ceph-client.admin.63731.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63731.log.gz 2026-03-24T17:34:58.904 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.71252.log.gz 2026-03-24T17:34:58.904 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76735.log 2026-03-24T17:34:58.905 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66017.log 2026-03-24T17:34:58.905 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.32092.log: /var/log/ceph/ceph-client.admin.76735.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76735.log.gz 2026-03-24T17:34:58.905 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32092.log.gz 2026-03-24T17:34:58.905 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90421.log 2026-03-24T17:34:58.906 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49083.log 2026-03-24T17:34:58.906 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.66017.log: /var/log/ceph/ceph-client.admin.90421.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90421.log.gz 2026-03-24T17:34:58.906 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.66017.log.gz 2026-03-24T17:34:58.906 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32382.log 2026-03-24T17:34:58.906 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74300.log 2026-03-24T17:34:58.907 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.49083.log: /var/log/ceph/ceph-client.admin.32382.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49083.log.gz 2026-03-24T17:34:58.907 
INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32382.log.gz 2026-03-24T17:34:58.907 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63928.log 2026-03-24T17:34:58.907 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38100.log 2026-03-24T17:34:58.907 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.74300.log: /var/log/ceph/ceph-client.admin.63928.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74300.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63928.log.gz 2026-03-24T17:34:58.907 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:58.908 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65479.log 2026-03-24T17:34:58.908 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78218.log 2026-03-24T17:34:58.908 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.38100.log: /var/log/ceph/ceph-client.admin.65479.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65479.log.gz 2026-03-24T17:34:58.908 INFO:teuthology.orchestra.run.vm01.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.38100.log.gz 2026-03-24T17:34:58.908 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42589.log 2026-03-24T17:34:58.909 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54092.log 2026-03-24T17:34:58.909 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.78218.log: /var/log/ceph/ceph-client.admin.42589.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78218.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42589.log.gz 2026-03-24T17:34:58.909 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:58.909 
INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37819.log 2026-03-24T17:34:58.909 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71173.log 2026-03-24T17:34:58.909 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.54092.log: /var/log/ceph/ceph-client.admin.37819.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54092.log.gz 2026-03-24T17:34:58.910 INFO:teuthology.orchestra.run.vm01.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.37819.log.gz 2026-03-24T17:34:58.910 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77379.log 2026-03-24T17:34:58.910 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27024.log 2026-03-24T17:34:58.910 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.71173.log: /var/log/ceph/ceph-client.admin.77379.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77379.log.gz 2026-03-24T17:34:58.910 INFO:teuthology.orchestra.run.vm01.stderr: 58.8% -- replaced with /var/log/ceph/ceph-client.admin.71173.log.gz 2026-03-24T17:34:58.911 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38772.log 2026-03-24T17:34:58.911 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45102.log 2026-03-24T17:34:58.911 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.27024.log: /var/log/ceph/ceph-client.admin.38772.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27024.log.gz 2026-03-24T17:34:58.911 INFO:teuthology.orchestra.run.vm01.stderr: 26.2% -- replaced with /var/log/ceph/ceph-client.admin.38772.log.gz 2026-03-24T17:34:58.911 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39685.log 2026-03-24T17:34:58.911 
INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39534.log 2026-03-24T17:34:58.912 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.45102.log: /var/log/ceph/ceph-client.admin.39685.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45102.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39685.log.gz 2026-03-24T17:34:58.912 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:58.912 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60843.log 2026-03-24T17:34:58.912 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88351.log 2026-03-24T17:34:58.912 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.39534.log: /var/log/ceph/ceph-client.admin.60843.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60843.log.gz 2026-03-24T17:34:58.912 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.39534.log.gz 2026-03-24T17:34:58.913 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81878.log 2026-03-24T17:34:58.913 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80275.log 2026-03-24T17:34:58.913 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.88351.log: /var/log/ceph/ceph-client.admin.81878.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81878.log.gz 2026-03-24T17:34:58.913 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88351.log.gz 2026-03-24T17:34:58.914 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88415.log 2026-03-24T17:34:58.914 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79756.log 2026-03-24T17:34:58.914 
INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.80275.log: /var/log/ceph/ceph-client.admin.88415.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80275.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88415.log.gz
2026-03-24T17:34:58.914 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:58.914 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60069.log
2026-03-24T17:34:58.915 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40050.log
2026-03-24T17:34:58.915 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.79756.log: /var/log/ceph/ceph-client.admin.60069.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79756.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60069.log.gz
2026-03-24T17:34:58.915 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:58.915 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81942.log
2026-03-24T17:34:58.915 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40029.log
2026-03-24T17:34:58.915 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.40050.log: /var/log/ceph/ceph-client.admin.81942.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40050.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81942.log.gz
2026-03-24T17:34:58.915 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:58.916 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46898.log
2026-03-24T17:34:58.916 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38898.log
2026-03-24T17:34:58.916 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.40029.log: /var/log/ceph/ceph-client.admin.46898.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46898.log.gz
2026-03-24T17:34:58.916 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.40029.log.gz
2026-03-24T17:34:58.916 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46179.log
2026-03-24T17:34:58.917 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33797.log
2026-03-24T17:34:58.917 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.38898.log: /var/log/ceph/ceph-client.admin.46179.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46179.log.gz
2026-03-24T17:34:58.917 INFO:teuthology.orchestra.run.vm01.stderr: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.38898.log.gz
2026-03-24T17:34:58.917 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82456.log
2026-03-24T17:34:58.917 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27026.log
2026-03-24T17:34:58.918 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.33797.log: /var/log/ceph/ceph-client.admin.82456.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82456.log.gz
2026-03-24T17:34:58.918 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.33797.log.gz
2026-03-24T17:34:58.918 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40165.log
2026-03-24T17:34:58.918 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83423.log
2026-03-24T17:34:58.918 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.27026.log: /var/log/ceph/ceph-client.admin.40165.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27026.log.gz
2026-03-24T17:34:58.918 INFO:teuthology.orchestra.run.vm01.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.40165.log.gz
2026-03-24T17:34:58.919 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83982.log
2026-03-24T17:34:58.919 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62898.log
2026-03-24T17:34:58.919 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.83423.log: /var/log/ceph/ceph-client.admin.83982.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83423.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83982.log.gz
2026-03-24T17:34:58.919 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:58.919 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76137.log
2026-03-24T17:34:58.919 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36950.log
2026-03-24T17:34:58.920 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.62898.log: /var/log/ceph/ceph-client.admin.76137.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76137.log.gz
2026-03-24T17:34:58.920 INFO:teuthology.orchestra.run.vm01.stderr: 29.3% -- replaced with /var/log/ceph/ceph-client.admin.62898.log.gz
2026-03-24T17:34:58.920 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54072.log
2026-03-24T17:34:58.920 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44844.log
2026-03-24T17:34:58.920 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.36950.log: /var/log/ceph/ceph-client.admin.54072.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54072.log.gz
2026-03-24T17:34:58.921 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.36950.log.gz
2026-03-24T17:34:58.921 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27911.log
2026-03-24T17:34:58.921 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50051.log
2026-03-24T17:34:58.921 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.44844.log: /var/log/ceph/ceph-client.admin.27911.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44844.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27911.log.gz
2026-03-24T17:34:58.921 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:58.921 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75296.log
2026-03-24T17:34:58.922 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27217.log
2026-03-24T17:34:58.922 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.50051.log: /var/log/ceph/ceph-client.admin.75296.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50051.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75296.log.gz
2026-03-24T17:34:58.922 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:58.922 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81469.log
2026-03-24T17:34:58.922 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36865.log
2026-03-24T17:34:58.923 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.27217.log: /var/log/ceph/ceph-client.admin.81469.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27217.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81469.log.gz
2026-03-24T17:34:58.923 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:58.923 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50640.log
2026-03-24T17:34:58.923 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79931.log
2026-03-24T17:34:58.923 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.36865.log: /var/log/ceph/ceph-client.admin.50640.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36865.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50640.log.gz
2026-03-24T17:34:58.923 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:58.924 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50073.log
2026-03-24T17:34:58.924 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69122.log
2026-03-24T17:34:58.924 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.79931.log: /var/log/ceph/ceph-client.admin.50073.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79931.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50073.log.gz
2026-03-24T17:34:58.924 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:58.924 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52688.log
2026-03-24T17:34:58.924 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58406.log
2026-03-24T17:34:58.925 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.69122.log: /var/log/ceph/ceph-client.admin.52688.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52688.log.gz
2026-03-24T17:34:58.925 INFO:teuthology.orchestra.run.vm01.stderr: 58.1% -- replaced with /var/log/ceph/ceph-client.admin.69122.log.gz
2026-03-24T17:34:58.925 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60650.log
2026-03-24T17:34:58.925 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83182.log
2026-03-24T17:34:58.926 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.58406.log: /var/log/ceph/ceph-client.admin.60650.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60650.log.gz
2026-03-24T17:34:58.926 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58406.log.gz
2026-03-24T17:34:58.926 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67878.log
2026-03-24T17:34:58.926 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42269.log
2026-03-24T17:34:58.926 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.83182.log: /var/log/ceph/ceph-client.admin.67878.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67878.log.gz
2026-03-24T17:34:58.926 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.83182.log.gz
2026-03-24T17:34:58.927 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57559.log
2026-03-24T17:34:58.927 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63771.log
2026-03-24T17:34:58.927 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.42269.log: /var/log/ceph/ceph-client.admin.57559.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57559.log.gz
2026-03-24T17:34:58.927 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.42269.log.gz
2026-03-24T17:34:58.927 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36831.log
2026-03-24T17:34:58.928 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42248.log
2026-03-24T17:34:58.928 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.63771.log: /var/log/ceph/ceph-client.admin.36831.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36831.log.gz
2026-03-24T17:34:58.928 INFO:teuthology.orchestra.run.vm01.stderr: 54.6% -- replaced with /var/log/ceph/ceph-client.admin.63771.log.gz
2026-03-24T17:34:58.928 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74578.log
2026-03-24T17:34:58.928 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62861.log
2026-03-24T17:34:58.929 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.42248.log: /var/log/ceph/ceph-client.admin.74578.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74578.log.gz
2026-03-24T17:34:58.929 INFO:teuthology.orchestra.run.vm01.stderr: 54.0% -- replaced with /var/log/ceph/ceph-client.admin.42248.log.gz
2026-03-24T17:34:58.929 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89727.log
2026-03-24T17:34:58.929 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38919.log
2026-03-24T17:34:58.929 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.62861.log: /var/log/ceph/ceph-client.admin.89727.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62861.log.gz
2026-03-24T17:34:58.929 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89727.log.gz
2026-03-24T17:34:58.930 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28471.log
2026-03-24T17:34:58.930 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70959.log
2026-03-24T17:34:58.930 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.38919.log: /var/log/ceph/ceph-client.admin.28471.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28471.log.gz
2026-03-24T17:34:58.930 INFO:teuthology.orchestra.run.vm01.stderr: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.38919.log.gz
2026-03-24T17:34:58.930 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41085.log
2026-03-24T17:34:58.931 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62784.log
2026-03-24T17:34:58.931 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.70959.log: /var/log/ceph/ceph-client.admin.41085.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70959.log.gz
2026-03-24T17:34:58.931 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41085.log.gz
2026-03-24T17:34:58.931 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36576.log
2026-03-24T17:34:58.931 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70601.log
2026-03-24T17:34:58.932 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.62784.log: /var/log/ceph/ceph-client.admin.36576.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62784.log.gz
2026-03-24T17:34:58.932 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36576.log.gz
2026-03-24T17:34:58.932 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64499.log
2026-03-24T17:34:58.932 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34839.log
2026-03-24T17:34:58.932 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.70601.log: /var/log/ceph/ceph-client.admin.64499.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70601.log.gz
2026-03-24T17:34:58.932 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64499.log.gz
2026-03-24T17:34:58.932 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61680.log
2026-03-24T17:34:58.933 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72552.log
2026-03-24T17:34:58.933 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.34839.log: /var/log/ceph/ceph-client.admin.61680.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34839.log.gz 0.0%
2026-03-24T17:34:58.933 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.61680.log.gz
2026-03-24T17:34:58.933 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65909.log
2026-03-24T17:34:58.933 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79348.log
2026-03-24T17:34:58.934 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.72552.log: /var/log/ceph/ceph-client.admin.65909.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72552.log.gz
2026-03-24T17:34:58.934 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65909.log.gz
2026-03-24T17:34:58.934 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73942.log
2026-03-24T17:34:58.934 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39024.log
2026-03-24T17:34:58.934 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.79348.log: /var/log/ceph/ceph-client.admin.73942.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73942.log.gz
2026-03-24T17:34:58.934 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.79348.log.gz
2026-03-24T17:34:58.935 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50094.log
2026-03-24T17:34:58.935 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32365.log
2026-03-24T17:34:58.935 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.39024.log: /var/log/ceph/ceph-client.admin.50094.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50094.log.gz
2026-03-24T17:34:58.935 INFO:teuthology.orchestra.run.vm01.stderr: 26.2% -- replaced with /var/log/ceph/ceph-client.admin.39024.log.gz
2026-03-24T17:34:58.935 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86141.log
2026-03-24T17:34:58.936 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77122.log
2026-03-24T17:34:58.936 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.32365.log: /var/log/ceph/ceph-client.admin.86141.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86141.log.gz
2026-03-24T17:34:58.936 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32365.log.gz
2026-03-24T17:34:58.936 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57220.log
2026-03-24T17:34:58.936 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80704.log
2026-03-24T17:34:58.937 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.77122.log: /var/log/ceph/ceph-client.admin.57220.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77122.log.gz
2026-03-24T17:34:58.937 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57220.log.gz
2026-03-24T17:34:58.937 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79240.log
2026-03-24T17:34:58.937 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78966.log
2026-03-24T17:34:58.937 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.79240.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79240.log.gz
2026-03-24T17:34:58.937 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.80704.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80704.log.gz
2026-03-24T17:34:58.938 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48933.log
2026-03-24T17:34:58.938 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30164.log
2026-03-24T17:34:58.938 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.78966.log: /var/log/ceph/ceph-client.admin.48933.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78966.log.gz
2026-03-24T17:34:58.938 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48933.log.gz
2026-03-24T17:34:58.938 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46448.log
2026-03-24T17:34:58.939 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79434.log
2026-03-24T17:34:58.939 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.30164.log: /var/log/ceph/ceph-client.admin.46448.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30164.log.gz
2026-03-24T17:34:58.939 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46448.log.gz
2026-03-24T17:34:58.939 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45664.log
2026-03-24T17:34:58.939 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53639.log
2026-03-24T17:34:58.940 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.79434.log: /var/log/ceph/ceph-client.admin.45664.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79434.log.gz
2026-03-24T17:34:58.940 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45664.log.gz
2026-03-24T17:34:58.940 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36559.log
2026-03-24T17:34:58.940 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67792.log
2026-03-24T17:34:58.940 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.53639.log: /var/log/ceph/ceph-client.admin.36559.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53639.log.gz
2026-03-24T17:34:58.940 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36559.log.gz
2026-03-24T17:34:58.941 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53225.log
2026-03-24T17:34:58.941 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39087.log
2026-03-24T17:34:58.941 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.67792.log: /var/log/ceph/ceph-client.admin.53225.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53225.log.gz
2026-03-24T17:34:58.941 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.67792.log.gz
2026-03-24T17:34:58.941 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68532.log
2026-03-24T17:34:58.941 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87457.log
2026-03-24T17:34:58.942 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.39087.log: /var/log/ceph/ceph-client.admin.68532.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68532.log.gz
2026-03-24T17:34:58.942 INFO:teuthology.orchestra.run.vm01.stderr: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.39087.log.gz
2026-03-24T17:34:58.942 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75729.log
2026-03-24T17:34:58.942 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40543.log
2026-03-24T17:34:58.942 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.87457.log: /var/log/ceph/ceph-client.admin.75729.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87457.log.gz
2026-03-24T17:34:58.943 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75729.log.gz
2026-03-24T17:34:58.943 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61295.log
2026-03-24T17:34:58.943 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53075.log
2026-03-24T17:34:58.943 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.40543.log: /var/log/ceph/ceph-client.admin.61295.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61295.log.gz
2026-03-24T17:34:58.943 INFO:teuthology.orchestra.run.vm01.stderr: 66.9% -- replaced with /var/log/ceph/ceph-client.admin.40543.log.gz
2026-03-24T17:34:58.943 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56341.log
2026-03-24T17:34:58.944 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30898.log
2026-03-24T17:34:58.944 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.53075.log: /var/log/ceph/ceph-client.admin.56341.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53075.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56341.log.gz
2026-03-24T17:34:58.944 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:58.944 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42347.log
2026-03-24T17:34:58.944 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54442.log
2026-03-24T17:34:58.945 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.30898.log: /var/log/ceph/ceph-client.admin.42347.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30898.log.gz
2026-03-24T17:34:58.945 INFO:teuthology.orchestra.run.vm01.stderr: 55.8% -- replaced with /var/log/ceph/ceph-client.admin.42347.log.gz
2026-03-24T17:34:58.945 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82881.log
2026-03-24T17:34:58.945 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41721.log
2026-03-24T17:34:58.946 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.54442.log: /var/log/ceph/ceph-client.admin.82881.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82881.log.gz
2026-03-24T17:34:58.946 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.54442.log.gz
2026-03-24T17:34:58.946 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80038.log
2026-03-24T17:34:58.946 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65587.log
2026-03-24T17:34:58.946 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.41721.log: /var/log/ceph/ceph-client.admin.80038.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80038.log.gz
2026-03-24T17:34:58.947 INFO:teuthology.orchestra.run.vm01.stderr: 54.6% -- replaced with /var/log/ceph/ceph-client.admin.41721.log.gz
2026-03-24T17:34:58.947 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68598.log
2026-03-24T17:34:58.947 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82666.log
2026-03-24T17:34:58.947 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.65587.log: /var/log/ceph/ceph-client.admin.68598.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65587.log.gz
2026-03-24T17:34:58.947 INFO:teuthology.orchestra.run.vm01.stderr: 25.8% -- replaced with /var/log/ceph/ceph-client.admin.68598.log.gz
2026-03-24T17:34:58.947 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28731.log
2026-03-24T17:34:58.948 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27045.log
2026-03-24T17:34:58.948 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.82666.log: /var/log/ceph/ceph-client.admin.28731.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82666.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28731.log.gz
2026-03-24T17:34:58.948 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:58.948 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73844.log
2026-03-24T17:34:58.948 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65845.log
2026-03-24T17:34:58.949 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.27045.log: /var/log/ceph/ceph-client.admin.73844.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73844.log.gz
2026-03-24T17:34:58.949 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.27045.log.gz
2026-03-24T17:34:58.949 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68286.log
2026-03-24T17:34:58.949 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89512.log
2026-03-24T17:34:58.949 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.65845.log: /var/log/ceph/ceph-client.admin.68286.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68286.log.gz
2026-03-24T17:34:58.949 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.65845.log.gz
2026-03-24T17:34:58.950 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64218.log
2026-03-24T17:34:58.950 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58783.log
2026-03-24T17:34:58.950 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.89512.log: /var/log/ceph/ceph-client.admin.64218.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89512.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64218.log.gz
2026-03-24T17:34:58.950 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:58.950 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30470.log
2026-03-24T17:34:58.950 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58255.log
2026-03-24T17:34:58.951 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.58783.log: /var/log/ceph/ceph-client.admin.30470.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30470.log.gz
2026-03-24T17:34:58.951 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.58783.log.gz
2026-03-24T17:34:58.951 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66060.log
2026-03-24T17:34:58.951 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66248.log
2026-03-24T17:34:58.951 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.58255.log: /var/log/ceph/ceph-client.admin.66060.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58255.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66060.log.gz
2026-03-24T17:34:58.951 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:58.952 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68007.log
2026-03-24T17:34:58.952 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82475.log
2026-03-24T17:34:58.952 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.66248.log: /var/log/ceph/ceph-client.admin.68007.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66248.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68007.log.gz
2026-03-24T17:34:58.952 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:58.952 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48013.log
2026-03-24T17:34:58.953 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59616.log
2026-03-24T17:34:58.953 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.82475.log: /var/log/ceph/ceph-client.admin.48013.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82475.log.gz 0.0%
2026-03-24T17:34:58.953 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.48013.log.gz
2026-03-24T17:34:58.953 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56236.log
2026-03-24T17:34:58.954 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57869.log
2026-03-24T17:34:58.954 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.59616.log: /var/log/ceph/ceph-client.admin.56236.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59616.log.gz
2026-03-24T17:34:58.954 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56236.log.gz
2026-03-24T17:34:58.954 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45642.log
2026-03-24T17:34:58.954 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32688.log
2026-03-24T17:34:58.954 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.57869.log: /var/log/ceph/ceph-client.admin.45642.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57869.log.gz
2026-03-24T17:34:58.955 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45642.log.gz
2026-03-24T17:34:58.955 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41042.log
2026-03-24T17:34:58.955 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77337.log
2026-03-24T17:34:58.955 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.32688.log: /var/log/ceph/ceph-client.admin.41042.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41042.log.gz
2026-03-24T17:34:58.955 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32688.log.gz
2026-03-24T17:34:58.955 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74406.log
2026-03-24T17:34:58.956 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39964.log
2026-03-24T17:34:58.956 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.77337.log: /var/log/ceph/ceph-client.admin.74406.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77337.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74406.log.gz
2026-03-24T17:34:58.956 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:58.956 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34999.log
2026-03-24T17:34:58.956 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70472.log
2026-03-24T17:34:58.957 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.39964.log: /var/log/ceph/ceph-client.admin.34999.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39964.log.gz
2026-03-24T17:34:58.957 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34999.log.gz
2026-03-24T17:34:58.957 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36712.log
2026-03-24T17:34:58.957 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79670.log
2026-03-24T17:34:58.957 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.70472.log: /var/log/ceph/ceph-client.admin.36712.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36712.log.gz
2026-03-24T17:34:58.957 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.70472.log.gz
2026-03-24T17:34:58.958 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33637.log
2026-03-24T17:34:58.958 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56962.log
2026-03-24T17:34:58.958 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.79670.log: /var/log/ceph/ceph-client.admin.33637.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79670.log.gz
2026-03-24T17:34:58.958 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33637.log.gz
2026-03-24T17:34:58.958 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89877.log
2026-03-24T17:34:58.958 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37798.log
2026-03-24T17:34:58.959 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.56962.log: /var/log/ceph/ceph-client.admin.89877.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56962.log.gz
2026-03-24T17:34:58.959 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89877.log.gz
2026-03-24T17:34:58.959 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28040.log
2026-03-24T17:34:58.959 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56510.log
2026-03-24T17:34:58.959 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.37798.log: /var/log/ceph/ceph-client.admin.28040.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37798.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28040.log.gz
2026-03-24T17:34:58.959 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:58.960 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73716.log
2026-03-24T17:34:58.960 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37588.log
2026-03-24T17:34:58.960 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.56510.log: /var/log/ceph/ceph-client.admin.73716.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73716.log.gz
2026-03-24T17:34:58.960 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.56510.log.gz
2026-03-24T17:34:58.960 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29305.log
2026-03-24T17:34:58.961 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83810.log
2026-03-24T17:34:58.961 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.37588.log: /var/log/ceph/ceph-client.admin.29305.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29305.log.gz
2026-03-24T17:34:58.961 INFO:teuthology.orchestra.run.vm01.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.37588.log.gz
2026-03-24T17:34:58.961 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46396.log
2026-03-24T17:34:58.961 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72380.log
2026-03-24T17:34:58.962 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.83810.log: /var/log/ceph/ceph-client.admin.46396.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83810.log.gz
2026-03-24T17:34:58.962 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46396.log.gz
2026-03-24T17:34:58.962 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50137.log
2026-03-24T17:34:58.962 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47932.log
2026-03-24T17:34:58.962 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.72380.log: /var/log/ceph/ceph-client.admin.50137.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72380.log.gz
2026-03-24T17:34:58.962 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50137.log.gz
2026-03-24T17:34:58.963 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83789.log
2026-03-24T17:34:58.963 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.22712.log
2026-03-24T17:34:58.963 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.47932.log: /var/log/ceph/ceph-client.admin.83789.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47932.log.gz
2026-03-24T17:34:58.963 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83789.log.gz
2026-03-24T17:34:58.963 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose --
/var/log/ceph/ceph-client.admin.53053.log 2026-03-24T17:34:58.964 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76367.log 2026-03-24T17:34:58.964 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.22712.log: /var/log/ceph/ceph-client.admin.53053.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.22712.log.gz 2026-03-24T17:34:58.964 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53053.log.gz 2026-03-24T17:34:58.964 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30941.log 2026-03-24T17:34:58.964 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36678.log 2026-03-24T17:34:58.965 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.76367.log: /var/log/ceph/ceph-client.admin.30941.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76367.log.gz 2026-03-24T17:34:58.965 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30941.log.gz 2026-03-24T17:34:58.965 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26869.log 2026-03-24T17:34:58.965 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28256.log 2026-03-24T17:34:58.965 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.36678.log: /var/log/ceph/ceph-client.admin.26869.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36678.log.gz 2026-03-24T17:34:58.965 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.26869.log.gz 2026-03-24T17:34:58.966 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76907.log 2026-03-24T17:34:58.966 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.65759.log 2026-03-24T17:34:58.966 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.28256.log: /var/log/ceph/ceph-client.admin.76907.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28256.log.gz 2026-03-24T17:34:58.966 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76907.log.gz 2026-03-24T17:34:58.966 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44801.log 2026-03-24T17:34:58.966 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65931.log 2026-03-24T17:34:58.967 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.65759.log: /var/log/ceph/ceph-client.admin.44801.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65759.log.gz 2026-03-24T17:34:58.967 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44801.log.gz 2026-03-24T17:34:58.967 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49040.log 2026-03-24T17:34:58.967 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47457.log 2026-03-24T17:34:58.967 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.65931.log: /var/log/ceph/ceph-client.admin.49040.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65931.log.gz 2026-03-24T17:34:58.967 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49040.log.gz 2026-03-24T17:34:58.968 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85480.log 2026-03-24T17:34:58.968 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66784.log 2026-03-24T17:34:58.968 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.47457.log: 
/var/log/ceph/ceph-client.admin.85480.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47457.log.gz 2026-03-24T17:34:58.968 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85480.log.gz 2026-03-24T17:34:58.968 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36235.log 2026-03-24T17:34:58.969 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80404.log 2026-03-24T17:34:58.969 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.66784.log: /var/log/ceph/ceph-client.admin.36235.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66784.log.gz 2026-03-24T17:34:58.969 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36235.log.gz 2026-03-24T17:34:58.969 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73609.log 2026-03-24T17:34:58.969 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36746.log 2026-03-24T17:34:58.970 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.80404.log: /var/log/ceph/ceph-client.admin.73609.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80404.log.gz 2026-03-24T17:34:58.970 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73609.log.gz 2026-03-24T17:34:58.970 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80447.log 2026-03-24T17:34:58.970 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27369.log 2026-03-24T17:34:58.970 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.36746.log: /var/log/ceph/ceph-client.admin.80447.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36746.log.gz 2026-03-24T17:34:58.970 
INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80447.log.gz 2026-03-24T17:34:58.971 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69442.log 2026-03-24T17:34:58.971 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57621.log 2026-03-24T17:34:58.971 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.27369.log: /var/log/ceph/ceph-client.admin.69442.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27369.log.gz 2026-03-24T17:34:58.971 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69442.log.gz 2026-03-24T17:34:58.971 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60005.log 2026-03-24T17:34:58.972 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43521.log 2026-03-24T17:34:58.972 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.57621.log: /var/log/ceph/ceph-client.admin.60005.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57621.log.gz 2026-03-24T17:34:58.972 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60005.log.gz 2026-03-24T17:34:58.972 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30797.log 2026-03-24T17:34:58.972 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47134.log 2026-03-24T17:34:58.973 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.43521.log: /var/log/ceph/ceph-client.admin.30797.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43521.log.gz 2026-03-24T17:34:58.973 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30797.log.gz 2026-03-24T17:34:58.973 
INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62294.log 2026-03-24T17:34:58.973 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37104.log 2026-03-24T17:34:58.973 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.47134.log: /var/log/ceph/ceph-client.admin.62294.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47134.log.gz 2026-03-24T17:34:58.973 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62294.log.gz 2026-03-24T17:34:58.974 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58492.log 2026-03-24T17:34:58.974 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66328.log 2026-03-24T17:34:58.974 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.37104.log: /var/log/ceph/ceph-client.admin.58492.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37104.log.gz 2026-03-24T17:34:58.974 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58492.log.gz 2026-03-24T17:34:58.974 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82109.log 2026-03-24T17:34:58.974 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27293.log 2026-03-24T17:34:58.975 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.66328.log: /var/log/ceph/ceph-client.admin.82109.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82109.log.gz 2026-03-24T17:34:58.975 INFO:teuthology.orchestra.run.vm01.stderr: 53.0% -- replaced with /var/log/ceph/ceph-client.admin.66328.log.gz 2026-03-24T17:34:58.975 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84166.log 2026-03-24T17:34:58.975 
INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56876.log 2026-03-24T17:34:58.975 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.27293.log: /var/log/ceph/ceph-client.admin.84166.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27293.log.gz 2026-03-24T17:34:58.976 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84166.log.gz 2026-03-24T17:34:58.976 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71793.log 2026-03-24T17:34:58.976 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73437.log 2026-03-24T17:34:58.976 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.56876.log: /var/log/ceph/ceph-client.admin.71793.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56876.log.gz 2026-03-24T17:34:58.976 INFO:teuthology.orchestra.run.vm01.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.71793.log.gz 2026-03-24T17:34:58.976 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35996.log 2026-03-24T17:34:58.977 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29519.log 2026-03-24T17:34:58.977 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.73437.log: /var/log/ceph/ceph-client.admin.35996.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73437.log.gz 2026-03-24T17:34:58.977 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35996.log.gz 2026-03-24T17:34:58.977 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73061.log 2026-03-24T17:34:58.978 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77654.log 2026-03-24T17:34:58.978 
INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.29519.log: /var/log/ceph/ceph-client.admin.73061.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29519.log.gz 2026-03-24T17:34:58.978 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73061.log.gz 2026-03-24T17:34:58.978 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44444.log 2026-03-24T17:34:58.978 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78199.log 2026-03-24T17:34:58.978 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.77654.log: /var/log/ceph/ceph-client.admin.44444.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77654.log.gz 2026-03-24T17:34:58.979 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44444.log.gz 2026-03-24T17:34:58.979 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69936.log 2026-03-24T17:34:58.979 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.22910.log 2026-03-24T17:34:58.979 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.78199.log: /var/log/ceph/ceph-client.admin.69936.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78199.log.gz 2026-03-24T17:34:58.979 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69936.log.gz 2026-03-24T17:34:58.979 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44302.log 2026-03-24T17:34:58.980 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78944.log 2026-03-24T17:34:58.980 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.22910.log: /var/log/ceph/ceph-client.admin.44302.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.22910.log.gz 2026-03-24T17:34:58.980 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44302.log.gz 2026-03-24T17:34:58.980 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25563.log 2026-03-24T17:34:58.980 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49492.log 2026-03-24T17:34:58.981 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.78944.log: /var/log/ceph/ceph-client.admin.25563.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25563.log.gz 2026-03-24T17:34:58.981 INFO:teuthology.orchestra.run.vm01.stderr: 55.6% -- replaced with /var/log/ceph/ceph-client.admin.78944.log.gz 2026-03-24T17:34:58.981 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26395.log 2026-03-24T17:34:58.981 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29777.log 2026-03-24T17:34:58.981 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.49492.log: /var/log/ceph/ceph-client.admin.26395.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49492.log.gz 2026-03-24T17:34:58.981 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26395.log.gz 2026-03-24T17:34:58.982 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86300.log 2026-03-24T17:34:58.982 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88302.log 2026-03-24T17:34:58.982 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.29777.log: /var/log/ceph/ceph-client.admin.86300.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29777.log.gz 2026-03-24T17:34:58.982 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.86300.log.gz 2026-03-24T17:34:58.982 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37087.log 2026-03-24T17:34:58.982 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35758.log 2026-03-24T17:34:58.983 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.88302.log: /var/log/ceph/ceph-client.admin.37087.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88302.log.gz 2026-03-24T17:34:58.983 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37087.log.gz 2026-03-24T17:34:58.983 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31973.log 2026-03-24T17:34:58.983 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72638.log 2026-03-24T17:34:58.983 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.35758.log: /var/log/ceph/ceph-client.admin.31973.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35758.log.gz 2026-03-24T17:34:58.984 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.31973.log.gz 2026-03-24T17:34:58.984 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54720.log 2026-03-24T17:34:58.984 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71831.log 2026-03-24T17:34:58.984 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.72638.log: /var/log/ceph/ceph-client.admin.54720.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72638.log.gz 2026-03-24T17:34:58.984 INFO:teuthology.orchestra.run.vm01.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.54720.log.gz 2026-03-24T17:34:58.984 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.40784.log 2026-03-24T17:34:58.985 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71296.log 2026-03-24T17:34:58.985 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.71831.log: /var/log/ceph/ceph-client.admin.40784.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71831.log.gz 2026-03-24T17:34:58.985 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40784.log.gz 2026-03-24T17:34:58.985 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33373.log 2026-03-24T17:34:58.985 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44737.log 2026-03-24T17:34:58.986 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.71296.log: /var/log/ceph/ceph-client.admin.33373.log: 52.9% -- replaced with /var/log/ceph/ceph-client.admin.71296.log.gz 2026-03-24T17:34:58.986 INFO:teuthology.orchestra.run.vm01.stderr: 55.0% -- replaced with /var/log/ceph/ceph-client.admin.33373.log.gz 2026-03-24T17:34:58.986 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34357.log 2026-03-24T17:34:58.986 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70976.log 2026-03-24T17:34:58.986 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.44737.log: /var/log/ceph/ceph-client.admin.34357.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44737.log.gz 2026-03-24T17:34:58.986 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34357.log.gz 2026-03-24T17:34:58.987 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68965.log 2026-03-24T17:34:58.987 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.56575.log 2026-03-24T17:34:58.987 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.70976.log: /var/log/ceph/ceph-client.admin.68965.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68965.log.gz 2026-03-24T17:34:58.987 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.70976.log.gz 2026-03-24T17:34:58.987 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75275.log 2026-03-24T17:34:58.988 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53722.log 2026-03-24T17:34:58.988 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.56575.log: /var/log/ceph/ceph-client.admin.75275.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56575.log.gz 2026-03-24T17:34:58.988 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75275.log.gz 2026-03-24T17:34:58.988 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79821.log 2026-03-24T17:34:58.988 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63451.log 2026-03-24T17:34:58.989 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.53722.log: /var/log/ceph/ceph-client.admin.79821.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53722.log.gz 2026-03-24T17:34:58.989 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79821.log.gz 2026-03-24T17:34:58.989 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61101.log 2026-03-24T17:34:58.989 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52731.log 2026-03-24T17:34:58.989 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.63451.log: 
/var/log/ceph/ceph-client.admin.61101.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63451.log.gz 2026-03-24T17:34:58.989 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61101.log.gz 2026-03-24T17:34:58.990 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75944.log 2026-03-24T17:34:58.990 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73515.log 2026-03-24T17:34:58.990 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.52731.log: /var/log/ceph/ceph-client.admin.75944.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52731.log.gz 2026-03-24T17:34:58.990 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75944.log.gz 2026-03-24T17:34:58.990 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29949.log 2026-03-24T17:34:58.990 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46072.log 2026-03-24T17:34:58.991 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.73515.log: /var/log/ceph/ceph-client.admin.29949.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73515.log.gz 2026-03-24T17:34:58.991 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29949.log.gz 2026-03-24T17:34:58.991 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66996.log 2026-03-24T17:34:58.991 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86708.log 2026-03-24T17:34:58.991 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.46072.log: /var/log/ceph/ceph-client.admin.66996.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46072.log.gz 2026-03-24T17:34:58.991 
INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66996.log.gz 2026-03-24T17:34:58.992 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47091.log 2026-03-24T17:34:58.992 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59550.log 2026-03-24T17:34:58.992 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.86708.log: /var/log/ceph/ceph-client.admin.47091.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47091.log.gz 2026-03-24T17:34:58.992 INFO:teuthology.orchestra.run.vm01.stderr: 26.9% -- replaced with /var/log/ceph/ceph-client.admin.86708.log.gz 2026-03-24T17:34:58.992 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36132.log 2026-03-24T17:34:58.993 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47588.log 2026-03-24T17:34:58.993 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.59550.log: /var/log/ceph/ceph-client.admin.36132.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36132.log.gz 2026-03-24T17:34:58.993 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.59550.log.gz 2026-03-24T17:34:58.993 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61892.log 2026-03-24T17:34:58.994 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85652.log 2026-03-24T17:34:58.994 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.47588.log: /var/log/ceph/ceph-client.admin.61892.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47588.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61892.log.gz 2026-03-24T17:34:58.994 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:58.994 
INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33166.log
2026-03-24T17:34:58.994 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45836.log
2026-03-24T17:34:58.995 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.85652.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85652.log.gz
2026-03-24T17:34:58.995 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.33166.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.33166.log.gz
2026-03-24T17:34:58.995 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88544.log
2026-03-24T17:34:58.995 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47177.log
2026-03-24T17:34:58.995 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.45836.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45836.log.gz
2026-03-24T17:34:58.995 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.88544.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88544.log.gz
... (repeated, interleaved "gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.<pid>.log" invocations and their "replaced with <file>.gz" results from 2026-03-24T17:34:58.995 through 17:34:59.044; nearly all logs compressed at 0.0%, with the exceptions: 71211 91.5%, 71133 57.9%, 69083 54.8%, 63986 53.2%, 40421 26.6%, 43261 26.2%, 37714 25.9%, 39003 25.6%, 42028 17.8%) ...
2026-03-24T17:34:59.044 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.61852.log: /var/log/ceph/ceph-client.admin.63334.log: 0.0% -- replaced with
/var/log/ceph/ceph-client.admin.61852.log.gz 2026-03-24T17:34:59.044 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63334.log.gz 2026-03-24T17:34:59.044 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80619.log 2026-03-24T17:34:59.044 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87185.log 2026-03-24T17:34:59.044 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.63395.log: /var/log/ceph/ceph-client.admin.80619.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63395.log.gz 2026-03-24T17:34:59.044 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80619.log.gz 2026-03-24T17:34:59.045 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81921.log 2026-03-24T17:34:59.045 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73960.log 2026-03-24T17:34:59.045 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.87185.log: /var/log/ceph/ceph-client.admin.81921.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87185.log.gz 2026-03-24T17:34:59.045 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81921.log.gz 2026-03-24T17:34:59.045 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75535.log 2026-03-24T17:34:59.046 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67211.log 2026-03-24T17:34:59.046 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.73960.log: /var/log/ceph/ceph-client.admin.75535.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73960.log.gz 2026-03-24T17:34:59.046 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.75535.log.gz 2026-03-24T17:34:59.046 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76008.log 2026-03-24T17:34:59.046 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28299.log 2026-03-24T17:34:59.047 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.67211.log: /var/log/ceph/ceph-client.admin.76008.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67211.log.gz 2026-03-24T17:34:59.047 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76008.log.gz 2026-03-24T17:34:59.047 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64615.log 2026-03-24T17:34:59.047 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43922.log 2026-03-24T17:34:59.047 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.28299.log: /var/log/ceph/ceph-client.admin.64615.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28299.log.gz 2026-03-24T17:34:59.047 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64615.log.gz 2026-03-24T17:34:59.048 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80748.log 2026-03-24T17:34:59.048 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65780.log 2026-03-24T17:34:59.048 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.43922.log: /var/log/ceph/ceph-client.admin.80748.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43922.log.gz 2026-03-24T17:34:59.048 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80748.log.gz 2026-03-24T17:34:59.048 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.32774.log 2026-03-24T17:34:59.048 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41928.log 2026-03-24T17:34:59.049 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.65780.log: /var/log/ceph/ceph-client.admin.32774.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65780.log.gz 2026-03-24T17:34:59.049 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32774.log.gz 2026-03-24T17:34:59.049 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68512.log 2026-03-24T17:34:59.049 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34437.log 2026-03-24T17:34:59.049 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.68512.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68512.log.gz 2026-03-24T17:34:59.050 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.41928.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41928.log.gz 2026-03-24T17:34:59.050 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77422.log 2026-03-24T17:34:59.050 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57910.log 2026-03-24T17:34:59.050 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.34437.log: /var/log/ceph/ceph-client.admin.77422.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34437.log.gz 2026-03-24T17:34:59.050 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77422.log.gz 2026-03-24T17:34:59.051 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60693.log 2026-03-24T17:34:59.051 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.37756.log 2026-03-24T17:34:59.051 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.57910.log: /var/log/ceph/ceph-client.admin.60693.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57910.log.gz 2026-03-24T17:34:59.051 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60693.log.gz 2026-03-24T17:34:59.051 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86665.log 2026-03-24T17:34:59.052 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54319.log 2026-03-24T17:34:59.052 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.37756.log: /var/log/ceph/ceph-client.admin.86665.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86665.log.gz 2026-03-24T17:34:59.052 INFO:teuthology.orchestra.run.vm01.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.37756.log.gz 2026-03-24T17:34:59.052 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83896.log 2026-03-24T17:34:59.052 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60585.log 2026-03-24T17:34:59.053 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.54319.log: /var/log/ceph/ceph-client.admin.83896.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54319.log.gz 2026-03-24T17:34:59.053 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83896.log.gz 2026-03-24T17:34:59.053 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28149.log 2026-03-24T17:34:59.053 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39362.log 2026-03-24T17:34:59.053 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.60585.log: 
/var/log/ceph/ceph-client.admin.28149.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60585.log.gz 2026-03-24T17:34:59.053 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28149.log.gz 2026-03-24T17:34:59.053 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53378.log 2026-03-24T17:34:59.054 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26094.log 2026-03-24T17:34:59.054 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.39362.log: /var/log/ceph/ceph-client.admin.53378.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39362.log.gz 2026-03-24T17:34:59.054 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53378.log.gz 2026-03-24T17:34:59.054 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73493.log 2026-03-24T17:34:59.054 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85824.log 2026-03-24T17:34:59.055 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.26094.log: /var/log/ceph/ceph-client.admin.73493.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26094.log.gz 2026-03-24T17:34:59.055 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73493.log.gz 2026-03-24T17:34:59.055 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89705.log 2026-03-24T17:34:59.055 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37672.log 2026-03-24T17:34:59.055 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.85824.log: /var/log/ceph/ceph-client.admin.89705.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85824.log.gz 2026-03-24T17:34:59.055 
INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89705.log.gz 2026-03-24T17:34:59.056 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61545.log 2026-03-24T17:34:59.056 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39642.log 2026-03-24T17:34:59.056 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.37672.log: /var/log/ceph/ceph-client.admin.61545.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61545.log.gz 2026-03-24T17:34:59.056 INFO:teuthology.orchestra.run.vm01.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.37672.log.gz 2026-03-24T17:34:59.056 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44486.log 2026-03-24T17:34:59.057 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70795.log 2026-03-24T17:34:59.057 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.39642.log: /var/log/ceph/ceph-client.admin.44486.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39642.log.gz 2026-03-24T17:34:59.057 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44486.log.gz 2026-03-24T17:34:59.057 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42145.log 2026-03-24T17:34:59.057 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36661.log 2026-03-24T17:34:59.058 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.70795.log: /var/log/ceph/ceph-client.admin.42145.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70795.log.gz 2026-03-24T17:34:59.058 INFO:teuthology.orchestra.run.vm01.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.42145.log.gz 2026-03-24T17:34:59.058 
INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44103.log 2026-03-24T17:34:59.058 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70300.log 2026-03-24T17:34:59.058 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.36661.log: /var/log/ceph/ceph-client.admin.44103.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36661.log.gz 2026-03-24T17:34:59.059 INFO:teuthology.orchestra.run.vm01.stderr: 55.5% -- replaced with /var/log/ceph/ceph-client.admin.44103.log.gz 2026-03-24T17:34:59.059 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89641.log 2026-03-24T17:34:59.059 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44544.log 2026-03-24T17:34:59.059 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.70300.log: 0.0%/var/log/ceph/ceph-client.admin.89641.log: -- replaced with /var/log/ceph/ceph-client.admin.70300.log.gz 2026-03-24T17:34:59.059 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89641.log.gz 2026-03-24T17:34:59.059 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46635.log 2026-03-24T17:34:59.060 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25537.log 2026-03-24T17:34:59.060 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.44544.log: /var/log/ceph/ceph-client.admin.46635.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46635.log.gz 2026-03-24T17:34:59.060 INFO:teuthology.orchestra.run.vm01.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.44544.log.gz 2026-03-24T17:34:59.060 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90207.log 2026-03-24T17:34:59.060 
INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26029.log 2026-03-24T17:34:59.061 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.25537.log: /var/log/ceph/ceph-client.admin.90207.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25537.log.gz 2026-03-24T17:34:59.061 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90207.log.gz 2026-03-24T17:34:59.061 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82709.log 2026-03-24T17:34:59.061 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54506.log 2026-03-24T17:34:59.061 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.26029.log: /var/log/ceph/ceph-client.admin.82709.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26029.log.gz 2026-03-24T17:34:59.061 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82709.log.gz 2026-03-24T17:34:59.062 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75858.log 2026-03-24T17:34:59.062 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67641.log 2026-03-24T17:34:59.062 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.54506.log: /var/log/ceph/ceph-client.admin.75858.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54506.log.gz 2026-03-24T17:34:59.062 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75858.log.gz 2026-03-24T17:34:59.062 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79842.log 2026-03-24T17:34:59.062 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35945.log 2026-03-24T17:34:59.063 
INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.67641.log: /var/log/ceph/ceph-client.admin.79842.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67641.log.gz 2026-03-24T17:34:59.063 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79842.log.gz 2026-03-24T17:34:59.063 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68157.log 2026-03-24T17:34:59.063 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87337.log 2026-03-24T17:34:59.063 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.35945.log: /var/log/ceph/ceph-client.admin.68157.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35945.log.gz 0.0% 2026-03-24T17:34:59.063 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.68157.log.gz 2026-03-24T17:34:59.064 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82580.log 2026-03-24T17:34:59.064 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68471.log 2026-03-24T17:34:59.064 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.87337.log: /var/log/ceph/ceph-client.admin.82580.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87337.log.gz 2026-03-24T17:34:59.064 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82580.log.gz 2026-03-24T17:34:59.064 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47788.log 2026-03-24T17:34:59.065 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82838.log 2026-03-24T17:34:59.065 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.68471.log: /var/log/ceph/ceph-client.admin.47788.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.68471.log.gz 2026-03-24T17:34:59.065 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47788.log.gz 2026-03-24T17:34:59.065 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62355.log 2026-03-24T17:34:59.065 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67964.log 2026-03-24T17:34:59.066 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.82838.log: /var/log/ceph/ceph-client.admin.62355.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82838.log.gz 2026-03-24T17:34:59.066 INFO:teuthology.orchestra.run.vm01.stderr: 58.2% -- replaced with /var/log/ceph/ceph-client.admin.62355.log.gz 2026-03-24T17:34:59.066 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89533.log 2026-03-24T17:34:59.066 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87159.log 2026-03-24T17:34:59.066 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.67964.log: /var/log/ceph/ceph-client.admin.89533.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67964.log.gz 2026-03-24T17:34:59.066 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89533.log.gz 2026-03-24T17:34:59.067 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34859.log 2026-03-24T17:34:59.067 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51306.log 2026-03-24T17:34:59.067 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.87159.log: /var/log/ceph/ceph-client.admin.34859.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87159.log.gz 2026-03-24T17:34:59.067 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.34859.log.gz 2026-03-24T17:34:59.067 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26352.log 2026-03-24T17:34:59.068 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49169.log 2026-03-24T17:34:59.068 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.51306.log: /var/log/ceph/ceph-client.admin.26352.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51306.log.gz 2026-03-24T17:34:59.068 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26352.log.gz 2026-03-24T17:34:59.068 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61811.log 2026-03-24T17:34:59.068 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30588.log 2026-03-24T17:34:59.068 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.49169.log: /var/log/ceph/ceph-client.admin.61811.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49169.log.gz 2026-03-24T17:34:59.069 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61811.log.gz 2026-03-24T17:34:59.069 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90184.log 2026-03-24T17:34:59.069 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53010.log 2026-03-24T17:34:59.069 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.30588.log: /var/log/ceph/ceph-client.admin.90184.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90184.log.gz 2026-03-24T17:34:59.069 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.30588.log.gz 2026-03-24T17:34:59.069 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.42507.log 2026-03-24T17:34:59.070 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53418.log 2026-03-24T17:34:59.070 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.53010.log: /var/log/ceph/ceph-client.admin.42507.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53010.log.gz 2026-03-24T17:34:59.070 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42507.log.gz 2026-03-24T17:34:59.070 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35707.log 2026-03-24T17:34:59.070 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67770.log 2026-03-24T17:34:59.071 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.53418.log: /var/log/ceph/ceph-client.admin.35707.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53418.log.gz 2026-03-24T17:34:59.071 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35707.log.gz 2026-03-24T17:34:59.071 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80146.log 2026-03-24T17:34:59.071 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81749.log 2026-03-24T17:34:59.071 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.67770.log: /var/log/ceph/ceph-client.admin.80146.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67770.log.gz 2026-03-24T17:34:59.071 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80146.log.gz 2026-03-24T17:34:59.072 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77774.log 2026-03-24T17:34:59.072 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.28407.log 2026-03-24T17:34:59.072 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.81749.log: /var/log/ceph/ceph-client.admin.77774.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81749.log.gz 2026-03-24T17:34:59.072 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77774.log.gz 2026-03-24T17:34:59.072 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84660.log 2026-03-24T17:34:59.073 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81340.log 2026-03-24T17:34:59.073 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.28407.log: /var/log/ceph/ceph-client.admin.84660.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28407.log.gz 2026-03-24T17:34:59.073 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84660.log.gz 2026-03-24T17:34:59.073 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61123.log 2026-03-24T17:34:59.073 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46675.log 2026-03-24T17:34:59.074 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.81340.log: /var/log/ceph/ceph-client.admin.61123.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81340.log.gz 2026-03-24T17:34:59.074 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61123.log.gz 2026-03-24T17:34:59.074 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25517.log 2026-03-24T17:34:59.074 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.46675.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80910.log 2026-03-24T17:34:59.074 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.46675.log.gz
[2026-03-24T17:34:59.074 – 17:34:59.124 INFO:teuthology.orchestra.run.vm01.stderr: interleaved output from parallel `gzip -5 --verbose` invocations on /var/log/ceph/ceph-client.admin.<pid>.log files elided; each log was replaced with its .log.gz counterpart. Compression was 0.0%–1.2% for nearly all files, with a few larger logs at 25.6%, 26.1%, 26.4%, 52.5%, 55.0%, 58.1%, and 86.9%.]
2026-03-24T17:34:59.124 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose --
/var/log/ceph/ceph-client.admin.78143.log 2026-03-24T17:34:59.124 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.31718.log: /var/log/ceph/ceph-client.admin.27658.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27658.log.gz 2026-03-24T17:34:59.124 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.31718.log.gz 2026-03-24T17:34:59.124 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80232.log 2026-03-24T17:34:59.125 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52989.log 2026-03-24T17:34:59.125 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.78143.log: /var/log/ceph/ceph-client.admin.80232.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78143.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80232.log.gz 2026-03-24T17:34:59.125 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.125 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49578.log 2026-03-24T17:34:59.125 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36252.log 2026-03-24T17:34:59.126 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.52989.log: /var/log/ceph/ceph-client.admin.49578.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52989.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49578.log.gz 2026-03-24T17:34:59.126 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.126 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26850.log 2026-03-24T17:34:59.126 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34757.log 2026-03-24T17:34:59.126 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.36252.log: 
/var/log/ceph/ceph-client.admin.26850.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36252.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26850.log.gz 2026-03-24T17:34:59.126 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.127 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85845.log 2026-03-24T17:34:59.127 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73645.log 2026-03-24T17:34:59.127 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.34757.log: /var/log/ceph/ceph-client.admin.85845.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85845.log.gz 2026-03-24T17:34:59.127 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.34757.log.gz 2026-03-24T17:34:59.127 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86515.log 2026-03-24T17:34:59.127 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53702.log 2026-03-24T17:34:59.128 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.73645.log: /var/log/ceph/ceph-client.admin.86515.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73645.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86515.log.gz 2026-03-24T17:34:59.128 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.128 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62453.log 2026-03-24T17:34:59.128 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63143.log 2026-03-24T17:34:59.128 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.53702.log: /var/log/ceph/ceph-client.admin.62453.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53702.log.gz 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.62453.log.gz 2026-03-24T17:34:59.128 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.129 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90091.log 2026-03-24T17:34:59.129 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34277.log 2026-03-24T17:34:59.129 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.63143.log: /var/log/ceph/ceph-client.admin.90091.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90091.log.gz 2026-03-24T17:34:59.129 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.63143.log.gz 2026-03-24T17:34:59.130 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62396.log 2026-03-24T17:34:59.130 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52774.log 2026-03-24T17:34:59.130 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.34277.log: /var/log/ceph/ceph-client.admin.62396.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62396.log.gz 2026-03-24T17:34:59.130 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.34277.log.gz 2026-03-24T17:34:59.130 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70429.log 2026-03-24T17:34:59.131 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29455.log 2026-03-24T17:34:59.131 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.52774.log: /var/log/ceph/ceph-client.admin.70429.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52774.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70429.log.gz 2026-03-24T17:34:59.131 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.131 
INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80892.log 2026-03-24T17:34:59.131 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35259.log 2026-03-24T17:34:59.131 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.29455.log: /var/log/ceph/ceph-client.admin.80892.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29455.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80892.log.gz 2026-03-24T17:34:59.132 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.132 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42713.log 2026-03-24T17:34:59.132 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40660.log 2026-03-24T17:34:59.132 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.35259.log: /var/log/ceph/ceph-client.admin.42713.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42713.log.gz 2026-03-24T17:34:59.132 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.35259.log.gz 2026-03-24T17:34:59.132 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31837.log 2026-03-24T17:34:59.133 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41804.log 2026-03-24T17:34:59.133 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.40660.log: /var/log/ceph/ceph-client.admin.31837.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40660.log.gz 2026-03-24T17:34:59.133 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.31837.log.gz 2026-03-24T17:34:59.133 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41600.log 2026-03-24T17:34:59.133 
INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27720.log 2026-03-24T17:34:59.134 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.41804.log: /var/log/ceph/ceph-client.admin.41600.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41804.log.gz 2026-03-24T17:34:59.134 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.41600.log.gz 2026-03-24T17:34:59.134 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68308.log 2026-03-24T17:34:59.134 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85198.log 2026-03-24T17:34:59.134 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.27720.log: /var/log/ceph/ceph-client.admin.68308.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27720.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68308.log.gz 2026-03-24T17:34:59.134 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.135 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77886.log 2026-03-24T17:34:59.135 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65974.log 2026-03-24T17:34:59.135 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.85198.log: /var/log/ceph/ceph-client.admin.77886.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85198.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77886.log.gz 2026-03-24T17:34:59.135 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.135 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60392.log 2026-03-24T17:34:59.135 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31905.log 2026-03-24T17:34:59.136 
INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.65974.log: /var/log/ceph/ceph-client.admin.60392.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65974.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60392.log.gz 2026-03-24T17:34:59.136 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.136 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47070.log 2026-03-24T17:34:59.136 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50346.log 2026-03-24T17:34:59.136 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.31905.log: /var/log/ceph/ceph-client.admin.47070.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47070.log.gz 2026-03-24T17:34:59.136 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.31905.log.gz 2026-03-24T17:34:59.137 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74124.log 2026-03-24T17:34:59.137 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36457.log 2026-03-24T17:34:59.137 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.50346.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50346.log.gz 2026-03-24T17:34:59.137 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.74124.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74124.log.gz 2026-03-24T17:34:59.137 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70537.log 2026-03-24T17:34:59.138 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28127.log 2026-03-24T17:34:59.138 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.36457.log: /var/log/ceph/ceph-client.admin.70537.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.36457.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70537.log.gz 2026-03-24T17:34:59.138 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.138 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84875.log 2026-03-24T17:34:59.138 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68093.log 2026-03-24T17:34:59.139 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.28127.log: /var/log/ceph/ceph-client.admin.84875.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28127.log.gz -- replaced with /var/log/ceph/ceph-client.admin.84875.log.gz 2026-03-24T17:34:59.139 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.139 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41763.log 2026-03-24T17:34:59.139 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73136.log 2026-03-24T17:34:59.139 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.68093.log: /var/log/ceph/ceph-client.admin.41763.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68093.log.gz 2026-03-24T17:34:59.139 INFO:teuthology.orchestra.run.vm01.stderr: 56.5% -- replaced with /var/log/ceph/ceph-client.admin.41763.log.gz 2026-03-24T17:34:59.140 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74224.log 2026-03-24T17:34:59.140 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63374.log 2026-03-24T17:34:59.140 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.73136.log: /var/log/ceph/ceph-client.admin.74224.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74224.log.gz 2026-03-24T17:34:59.140 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with 
/var/log/ceph/ceph-client.admin.73136.log.gz 2026-03-24T17:34:59.140 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34697.log 2026-03-24T17:34:59.140 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45145.log 2026-03-24T17:34:59.141 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.63374.log: /var/log/ceph/ceph-client.admin.34697.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34697.log.gz 2026-03-24T17:34:59.141 INFO:teuthology.orchestra.run.vm01.stderr: 55.3% -- replaced with /var/log/ceph/ceph-client.admin.63374.log.gz 2026-03-24T17:34:59.141 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43282.log 2026-03-24T17:34:59.141 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73100.log 2026-03-24T17:34:59.141 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.45145.log: /var/log/ceph/ceph-client.admin.43282.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45145.log.gz 2026-03-24T17:34:59.142 INFO:teuthology.orchestra.run.vm01.stderr: 25.8% -- replaced with /var/log/ceph/ceph-client.admin.43282.log.gz 2026-03-24T17:34:59.142 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39255.log 2026-03-24T17:34:59.142 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60220.log 2026-03-24T17:34:59.142 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.73100.log: /var/log/ceph/ceph-client.admin.39255.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73100.log.gz 2026-03-24T17:34:59.142 INFO:teuthology.orchestra.run.vm01.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.39255.log.gz 2026-03-24T17:34:59.142 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.30779.log 2026-03-24T17:34:59.143 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50218.log 2026-03-24T17:34:59.143 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.60220.log: /var/log/ceph/ceph-client.admin.30779.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30779.log.gz 2026-03-24T17:34:59.143 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.60220.log.gz 2026-03-24T17:34:59.143 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70752.log 2026-03-24T17:34:59.143 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89340.log 2026-03-24T17:34:59.144 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.50218.log: /var/log/ceph/ceph-client.admin.70752.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70752.log.gz 2026-03-24T17:34:59.144 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.50218.log.gz 2026-03-24T17:34:59.144 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64338.log 2026-03-24T17:34:59.144 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46552.log 2026-03-24T17:34:59.144 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.89340.log: /var/log/ceph/ceph-client.admin.64338.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64338.log.gz 2026-03-24T17:34:59.144 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.89340.log.gz 2026-03-24T17:34:59.145 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36269.log 2026-03-24T17:34:59.145 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.33337.log 2026-03-24T17:34:59.145 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.46552.log: /var/log/ceph/ceph-client.admin.36269.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46552.log.gz 2026-03-24T17:34:59.145 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36269.log.gz 2026-03-24T17:34:59.146 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62687.log 2026-03-24T17:34:59.146 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35219.log 2026-03-24T17:34:59.146 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.33337.log: /var/log/ceph/ceph-client.admin.62687.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62687.log.gz 2026-03-24T17:34:59.146 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.33337.log.gz 2026-03-24T17:34:59.146 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58040.log 2026-03-24T17:34:59.147 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64670.log 2026-03-24T17:34:59.147 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.35219.log: /var/log/ceph/ceph-client.admin.58040.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35219.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58040.log.gz 2026-03-24T17:34:59.147 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.147 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49901.log 2026-03-24T17:34:59.147 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29347.log 2026-03-24T17:34:59.147 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.64670.log: 
/var/log/ceph/ceph-client.admin.49901.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49901.log.gz 2026-03-24T17:34:59.147 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.64670.log.gz 2026-03-24T17:34:59.148 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72337.log 2026-03-24T17:34:59.148 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89684.log 2026-03-24T17:34:59.148 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.29347.log: /var/log/ceph/ceph-client.admin.72337.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29347.log.gz -- replaced with /var/log/ceph/ceph-client.admin.72337.log.gz 2026-03-24T17:34:59.148 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.148 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43982.log 2026-03-24T17:34:59.149 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47731.log 2026-03-24T17:34:59.149 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.89684.log: /var/log/ceph/ceph-client.admin.43982.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89684.log.gz 2026-03-24T17:34:59.149 INFO:teuthology.orchestra.run.vm01.stderr: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.43982.log.gz 2026-03-24T17:34:59.149 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48530.log 2026-03-24T17:34:59.149 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83746.log 2026-03-24T17:34:59.150 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.47731.log: /var/log/ceph/ceph-client.admin.48530.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47731.log.gz 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.48530.log.gz 2026-03-24T17:34:59.150 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.150 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63102.log 2026-03-24T17:34:59.150 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58642.log 2026-03-24T17:34:59.150 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.83746.log: /var/log/ceph/ceph-client.admin.63102.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83746.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63102.log.gz 2026-03-24T17:34:59.150 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.154 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80576.log 2026-03-24T17:34:59.154 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34737.log 2026-03-24T17:34:59.154 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.80576.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80576.log.gz 2026-03-24T17:34:59.154 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.58642.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58642.log.gz 2026-03-24T17:34:59.155 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43740.log 2026-03-24T17:34:59.155 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84617.log 2026-03-24T17:34:59.155 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.34737.log: /var/log/ceph/ceph-client.admin.43740.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34737.log.gz 2026-03-24T17:34:59.155 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43740.log.gz 2026-03-24T17:34:59.155 
INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48570.log 2026-03-24T17:34:59.155 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34879.log 2026-03-24T17:34:59.156 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.84617.log: /var/log/ceph/ceph-client.admin.48570.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84617.log.gz 2026-03-24T17:34:59.156 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48570.log.gz 2026-03-24T17:34:59.156 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46614.log 2026-03-24T17:34:59.156 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.22851.log 2026-03-24T17:34:59.156 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.34879.log: /var/log/ceph/ceph-client.admin.46614.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34879.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46614.log.gz 2026-03-24T17:34:59.156 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.157 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50941.log 2026-03-24T17:34:59.157 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85867.log 2026-03-24T17:34:59.157 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.22851.log: /var/log/ceph/ceph-client.admin.50941.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.22851.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50941.log.gz 2026-03-24T17:34:59.157 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.157 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36933.log 2026-03-24T17:34:59.158 
INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64964.log
2026-03-24T17:34:59.158 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.85867.log: /var/log/ceph/ceph-client.admin.36933.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85867.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36933.log.gz
2026-03-24T17:34:59.158 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:59.158 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86966.log
2026-03-24T17:34:59.158 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72975.log
2026-03-24T17:34:59.158 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.64964.log: /var/log/ceph/ceph-client.admin.86966.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64964.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86966.log.gz
2026-03-24T17:34:59.158 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:59.159 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86579.log
2026-03-24T17:34:59.159 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38751.log
2026-03-24T17:34:59.159 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.72975.log: /var/log/ceph/ceph-client.admin.86579.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72975.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86579.log.gz
2026-03-24T17:34:59.159 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:59.159 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36899.log
2026-03-24T17:34:59.160 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40071.log
2026-03-24T17:34:59.160 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.38751.log: /var/log/ceph/ceph-client.admin.36899.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36899.log.gz
2026-03-24T17:34:59.160 INFO:teuthology.orchestra.run.vm01.stderr: 25.3% -- replaced with /var/log/ceph/ceph-client.admin.38751.log.gz
2026-03-24T17:34:59.160 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40762.log
2026-03-24T17:34:59.160 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67254.log
2026-03-24T17:34:59.161 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.40071.log: /var/log/ceph/ceph-client.admin.40762.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40071.log.gz
2026-03-24T17:34:59.161 INFO:teuthology.orchestra.run.vm01.stderr: 25.3% -- replaced with /var/log/ceph/ceph-client.admin.40762.log.gz
2026-03-24T17:34:59.161 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54570.log
2026-03-24T17:34:59.161 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79864.log
2026-03-24T17:34:59.161 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.67254.log: /var/log/ceph/ceph-client.admin.54570.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67254.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54570.log.gz
2026-03-24T17:34:59.161 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:59.162 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78469.log
2026-03-24T17:34:59.162 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34477.log
2026-03-24T17:34:59.162 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.79864.log: /var/log/ceph/ceph-client.admin.78469.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79864.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78469.log.gz
2026-03-24T17:34:59.162 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:59.162 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40300.log
2026-03-24T17:34:59.163 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42899.log
2026-03-24T17:34:59.163 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.34477.log: /var/log/ceph/ceph-client.admin.40300.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34477.log.gz
2026-03-24T17:34:59.163 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40300.log.gz
2026-03-24T17:34:59.163 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78615.log
2026-03-24T17:34:59.163 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31285.log
2026-03-24T17:34:59.163 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.42899.log: /var/log/ceph/ceph-client.admin.78615.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42899.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78615.log.gz
2026-03-24T17:34:59.163 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:59.164 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65085.log
2026-03-24T17:34:59.164 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42103.log
2026-03-24T17:34:59.164 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.31285.log: /var/log/ceph/ceph-client.admin.65085.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31285.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65085.log.gz
2026-03-24T17:34:59.164 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:59.164 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31701.log
2026-03-24T17:34:59.165 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36115.log
2026-03-24T17:34:59.165 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.42103.log: /var/log/ceph/ceph-client.admin.31701.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.42103.log.gz
2026-03-24T17:34:59.165 INFO:teuthology.orchestra.run.vm01.stderr: 3.8% -- replaced with /var/log/ceph/ceph-client.admin.31701.log.gz
2026-03-24T17:34:59.165 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73118.log
2026-03-24T17:34:59.165 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89958.log
2026-03-24T17:34:59.166 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.36115.log: /var/log/ceph/ceph-client.admin.73118.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36115.log.gz
2026-03-24T17:34:59.166 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73118.log.gz
2026-03-24T17:34:59.166 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83488.log
2026-03-24T17:34:59.166 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89297.log
2026-03-24T17:34:59.166 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.89958.log: /var/log/ceph/ceph-client.admin.83488.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83488.log.gz
2026-03-24T17:34:59.166 INFO:teuthology.orchestra.run.vm01.stderr: 58.4% -- replaced with /var/log/ceph/ceph-client.admin.89958.log.gz
2026-03-24T17:34:59.167 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74336.log
2026-03-24T17:34:59.167 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65544.log
2026-03-24T17:34:59.167 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.89297.log: /var/log/ceph/ceph-client.admin.74336.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89297.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74336.log.gz
2026-03-24T17:34:59.167 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:59.167 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62706.log
2026-03-24T17:34:59.168 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71376.log
2026-03-24T17:34:59.168 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.65544.log: /var/log/ceph/ceph-client.admin.62706.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65544.log.gz
2026-03-24T17:34:59.168 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62706.log.gz
2026-03-24T17:34:59.168 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72144.log
2026-03-24T17:34:59.168 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80103.log
2026-03-24T17:34:59.168 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.71376.log: /var/log/ceph/ceph-client.admin.72144.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72144.log.gz
2026-03-24T17:34:59.169 INFO:teuthology.orchestra.run.vm01.stderr: 55.3% -- replaced with /var/log/ceph/ceph-client.admin.71376.log.gz
2026-03-24T17:34:59.169 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86038.log
2026-03-24T17:34:59.169 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26158.log
2026-03-24T17:34:59.169 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.80103.log: /var/log/ceph/ceph-client.admin.86038.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80103.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86038.log.gz
2026-03-24T17:34:59.169 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:59.170 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28694.log
2026-03-24T17:34:59.170 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60607.log
2026-03-24T17:34:59.170 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.26158.log: /var/log/ceph/ceph-client.admin.28694.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26158.log.gz
2026-03-24T17:34:59.170 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.28694.log.gz
2026-03-24T17:34:59.170 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49471.log
2026-03-24T17:34:59.170 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56639.log
2026-03-24T17:34:59.171 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.60607.log: /var/log/ceph/ceph-client.admin.49471.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60607.log.gz
2026-03-24T17:34:59.171 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49471.log.gz
2026-03-24T17:34:59.171 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71950.log
2026-03-24T17:34:59.171 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31871.log
2026-03-24T17:34:59.171 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.56639.log: /var/log/ceph/ceph-client.admin.71950.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56639.log.gz
2026-03-24T17:34:59.171 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71950.log.gz
2026-03-24T17:34:59.172 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45427.log
2026-03-24T17:34:59.172 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43196.log
2026-03-24T17:34:59.172 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.31871.log: /var/log/ceph/ceph-client.admin.45427.log: 1.2% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45427.log.gz
2026-03-24T17:34:59.172 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.31871.log.gz
2026-03-24T17:34:59.172 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63471.log
2026-03-24T17:34:59.173 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72887.log
2026-03-24T17:34:59.173 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.43196.log: /var/log/ceph/ceph-client.admin.63471.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63471.log.gz
2026-03-24T17:34:59.173 INFO:teuthology.orchestra.run.vm01.stderr: 58.1% -- replaced with /var/log/ceph/ceph-client.admin.43196.log.gz
2026-03-24T17:34:59.173 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59489.log
2026-03-24T17:34:59.174 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39814.log
2026-03-24T17:34:59.174 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.72887.log: /var/log/ceph/ceph-client.admin.59489.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72887.log.gz
2026-03-24T17:34:59.174 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59489.log.gz
2026-03-24T17:34:59.174 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58749.log
2026-03-24T17:34:59.174 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60564.log
2026-03-24T17:34:59.175 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.39814.log: /var/log/ceph/ceph-client.admin.58749.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39814.log.gz
2026-03-24T17:34:59.175 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58749.log.gz
2026-03-24T17:34:59.175 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85394.log
2026-03-24T17:34:59.175 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32058.log
2026-03-24T17:34:59.175 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.60564.log: /var/log/ceph/ceph-client.admin.85394.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60564.log.gz
2026-03-24T17:34:59.175 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85394.log.gz
2026-03-24T17:34:59.176 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38835.log
2026-03-24T17:34:59.176 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27083.log
2026-03-24T17:34:59.176 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.32058.log: /var/log/ceph/ceph-client.admin.38835.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32058.log.gz
2026-03-24T17:34:59.176 INFO:teuthology.orchestra.run.vm01.stderr: 25.7% -- replaced with /var/log/ceph/ceph-client.admin.38835.log.gz
2026-03-24T17:34:59.176 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76509.log
2026-03-24T17:34:59.176 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87117.log
2026-03-24T17:34:59.177 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.27083.log: /var/log/ceph/ceph-client.admin.76509.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76509.log.gz
2026-03-24T17:34:59.177 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.27083.log.gz
2026-03-24T17:34:59.177 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27331.log
2026-03-24T17:34:59.177 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60241.log
2026-03-24T17:34:59.177 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.87117.log: /var/log/ceph/ceph-client.admin.27331.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87117.log.gz
2026-03-24T17:34:59.178 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27331.log.gz
2026-03-24T17:34:59.178 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66740.log
2026-03-24T17:34:59.178 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81663.log
2026-03-24T17:34:59.178 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.60241.log: /var/log/ceph/ceph-client.admin.66740.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60241.log.gz
2026-03-24T17:34:59.178 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66740.log.gz
2026-03-24T17:34:59.178 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71848.log
2026-03-24T17:34:59.179 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61015.log
2026-03-24T17:34:59.179 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.81663.log: /var/log/ceph/ceph-client.admin.71848.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81663.log.gz
2026-03-24T17:34:59.179 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71848.log.gz
2026-03-24T17:34:59.179 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67039.log
2026-03-24T17:34:59.179 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41583.log
2026-03-24T17:34:59.180 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.61015.log: /var/log/ceph/ceph-client.admin.67039.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61015.log.gz
2026-03-24T17:34:59.180 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67039.log.gz
2026-03-24T17:34:59.180 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48493.log
2026-03-24T17:34:59.180 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33717.log
2026-03-24T17:34:59.180 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.41583.log: /var/log/ceph/ceph-client.admin.48493.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41583.log.gz
2026-03-24T17:34:59.180 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.48493.log.gz
2026-03-24T17:34:59.181 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83875.log
2026-03-24T17:34:59.181 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41515.log
2026-03-24T17:34:59.181 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.33717.log: /var/log/ceph/ceph-client.admin.83875.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33717.log.gz
2026-03-24T17:34:59.181 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83875.log.gz
2026-03-24T17:34:59.182 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83466.log
2026-03-24T17:34:59.182 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37420.log
2026-03-24T17:34:59.182 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.41515.log: /var/log/ceph/ceph-client.admin.83466.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41515.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83466.log.gz
2026-03-24T17:34:59.182 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:59.182 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86281.log
2026-03-24T17:34:59.182 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26524.log
2026-03-24T17:34:59.183 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.37420.log: /var/log/ceph/ceph-client.admin.86281.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86281.log.gz
2026-03-24T17:34:59.183 INFO:teuthology.orchestra.run.vm01.stderr: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.37420.log.gz
2026-03-24T17:34:59.183 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45900.log
2026-03-24T17:34:59.183 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69700.log
2026-03-24T17:34:59.183 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.26524.log: /var/log/ceph/ceph-client.admin.45900.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45900.log.gz
2026-03-24T17:34:59.183 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.26524.log.gz
2026-03-24T17:34:59.184 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69005.log
2026-03-24T17:34:59.184 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34417.log
2026-03-24T17:34:59.184 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.69700.log: /var/log/ceph/ceph-client.admin.69005.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69700.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69005.log.gz
2026-03-24T17:34:59.184 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:59.184 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50324.log
2026-03-24T17:34:59.185 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46181.log
2026-03-24T17:34:59.185 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.34417.log: /var/log/ceph/ceph-client.admin.50324.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34417.log.gz -- replaced with /var/log/ceph/ceph-client.admin.50324.log.gz
2026-03-24T17:34:59.185 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:59.185 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86361.log
2026-03-24T17:34:59.185 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57048.log
2026-03-24T17:34:59.186 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.46181.log: /var/log/ceph/ceph-client.admin.86361.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86361.log.gz
2026-03-24T17:34:59.186 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.46181.log.gz
2026-03-24T17:34:59.186 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66805.log
2026-03-24T17:34:59.186 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43127.log
2026-03-24T17:34:59.186 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.57048.log: /var/log/ceph/ceph-client.admin.66805.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57048.log.gz 0.0%
2026-03-24T17:34:59.186 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.66805.log.gz
2026-03-24T17:34:59.186 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90661.log
2026-03-24T17:34:59.187 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69045.log
2026-03-24T17:34:59.187 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.43127.log: /var/log/ceph/ceph-client.admin.90661.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43127.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90661.log.gz
2026-03-24T17:34:59.187 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:59.187 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55753.log
2026-03-24T17:34:59.187 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58728.log
2026-03-24T17:34:59.188 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.69045.log: /var/log/ceph/ceph-client.admin.55753.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69045.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55753.log.gz
2026-03-24T17:34:59.188 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:59.188 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61359.log
2026-03-24T17:34:59.188 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74716.log
2026-03-24T17:34:59.188 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.58728.log: /var/log/ceph/ceph-client.admin.61359.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58728.log.gz
2026-03-24T17:34:59.188 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61359.log.gz
2026-03-24T17:34:59.189 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.22827.log
2026-03-24T17:34:59.189 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33677.log
2026-03-24T17:34:59.189 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.74716.log: /var/log/ceph/ceph-client.admin.22827.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74716.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.22827.log.gz
2026-03-24T17:34:59.189 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:59.189 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61338.log
2026-03-24T17:34:59.190 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83961.log
2026-03-24T17:34:59.190 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.33677.log: /var/log/ceph/ceph-client.admin.61338.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61338.log.gz
2026-03-24T17:34:59.190 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.33677.log.gz
2026-03-24T17:34:59.190 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45599.log
2026-03-24T17:34:59.190 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73009.log
2026-03-24T17:34:59.191 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.83961.log: /var/log/ceph/ceph-client.admin.45599.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83961.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45599.log.gz
2026-03-24T17:34:59.191 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:59.191 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30641.log
2026-03-24T17:34:59.191 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29133.log
2026-03-24T17:34:59.191 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.73009.log: /var/log/ceph/ceph-client.admin.30641.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73009.log.gz
2026-03-24T17:34:59.191 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30641.log.gz
2026-03-24T17:34:59.191 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71335.log
2026-03-24T17:34:59.192 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76526.log
2026-03-24T17:34:59.192 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.29133.log: /var/log/ceph/ceph-client.admin.71335.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29133.log.gz
2026-03-24T17:34:59.192 INFO:teuthology.orchestra.run.vm01.stderr: 12.0% -- replaced with /var/log/ceph/ceph-client.admin.71335.log.gz
2026-03-24T17:34:59.192 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60026.log
2026-03-24T17:34:59.192 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62105.log
2026-03-24T17:34:59.193 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.76526.log: /var/log/ceph/ceph-client.admin.60026.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76526.log.gz
2026-03-24T17:34:59.193 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60026.log.gz
2026-03-24T17:34:59.193 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75339.log
2026-03-24T17:34:59.193 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58578.log
2026-03-24T17:34:59.193 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.62105.log: /var/log/ceph/ceph-client.admin.75339.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62105.log.gz
2026-03-24T17:34:59.193 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75339.log.gz
2026-03-24T17:34:59.194 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60349.log
2026-03-24T17:34:59.194 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44629.log
2026-03-24T17:34:59.194 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.58578.log: /var/log/ceph/ceph-client.admin.60349.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58578.log.gz
2026-03-24T17:34:59.194 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60349.log.gz
2026-03-24T17:34:59.194 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57177.log
2026-03-24T17:34:59.195 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38604.log
2026-03-24T17:34:59.195 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.44629.log: /var/log/ceph/ceph-client.admin.57177.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44629.log.gz
2026-03-24T17:34:59.195 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57177.log.gz
2026-03-24T17:34:59.195 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43642.log
2026-03-24T17:34:59.195 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30723.log
2026-03-24T17:34:59.195 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.38604.log: /var/log/ceph/ceph-client.admin.43642.log: 25.3% -- replaced with /var/log/ceph/ceph-client.admin.38604.log.gz
2026-03-24T17:34:59.196 INFO:teuthology.orchestra.run.vm01.stderr: 31.8% -- replaced with /var/log/ceph/ceph-client.admin.43642.log.gz
2026-03-24T17:34:59.196 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59470.log
2026-03-24T17:34:59.196 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61058.log
2026-03-24T17:34:59.196 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.30723.log: /var/log/ceph/ceph-client.admin.59470.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59470.log.gz
2026-03-24T17:34:59.196 INFO:teuthology.orchestra.run.vm01.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.30723.log.gz
2026-03-24T17:34:59.196 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74389.log
2026-03-24T17:34:59.197 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53458.log
2026-03-24T17:34:59.197 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.61058.log: /var/log/ceph/ceph-client.admin.74389.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61058.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74389.log.gz
2026-03-24T17:34:59.197 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:59.197 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26051.log
2026-03-24T17:34:59.197 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31990.log
2026-03-24T17:34:59.198 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.53458.log: /var/log/ceph/ceph-client.admin.26051.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53458.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26051.log.gz
2026-03-24T17:34:59.198 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:59.198 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41742.log
2026-03-24T17:34:59.198 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71972.log
2026-03-24T17:34:59.198 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.31990.log: /var/log/ceph/ceph-client.admin.41742.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41742.log.gz
2026-03-24T17:34:59.199 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62045.log
2026-03-24T17:34:59.199 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.31990.log.gz
2026-03-24T17:34:59.199 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32620.log
2026-03-24T17:34:59.199 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.62045.log: /var/log/ceph/ceph-client.admin.71972.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62045.log.gz
2026-03-24T17:34:59.199 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71972.log.gz
2026-03-24T17:34:59.199 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90583.log
2026-03-24T17:34:59.200 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38709.log
2026-03-24T17:34:59.200 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.32620.log: /var/log/ceph/ceph-client.admin.90583.log: 0.0% 1.2% -- replaced with /var/log/ceph/ceph-client.admin.90583.log.gz
2026-03-24T17:34:59.200 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.32620.log.gz
2026-03-24T17:34:59.200 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43003.log
2026-03-24T17:34:59.200 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70193.log
2026-03-24T17:34:59.201 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.38709.log: /var/log/ceph/ceph-client.admin.43003.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43003.log.gz
2026-03-24T17:34:59.201 INFO:teuthology.orchestra.run.vm01.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.38709.log.gz
2026-03-24T17:34:59.201 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34297.log
2026-03-24T17:34:59.201 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39192.log
2026-03-24T17:34:59.202 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.70193.log: /var/log/ceph/ceph-client.admin.34297.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34297.log.gz
2026-03-24T17:34:59.202 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.70193.log.gz
2026-03-24T17:34:59.202 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88931.log
2026-03-24T17:34:59.202 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42166.log
2026-03-24T17:34:59.202 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.39192.log: /var/log/ceph/ceph-client.admin.88931.log: 25.7% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88931.log.gz
2026-03-24T17:34:59.202 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.39192.log.gz
2026-03-24T17:34:59.202 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30551.log
2026-03-24T17:34:59.203 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90019.log
2026-03-24T17:34:59.203 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.42166.log: /var/log/ceph/ceph-client.admin.30551.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30551.log.gz
2026-03-24T17:34:59.203 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.42166.log.gz
2026-03-24T17:34:59.203 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80961.log
2026-03-24T17:34:59.203 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69399.log
2026-03-24T17:34:59.204 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.90019.log: /var/log/ceph/ceph-client.admin.80961.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80961.log.gz
2026-03-24T17:34:59.204 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.90019.log.gz
2026-03-24T17:34:59.204 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28791.log
2026-03-24T17:34:59.204 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82254.log
2026-03-24T17:34:59.204 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.69399.log: /var/log/ceph/ceph-client.admin.28791.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69399.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28791.log.gz
2026-03-24T17:34:59.204 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:34:59.205 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63257.log
2026-03-24T17:34:59.205 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46655.log
2026-03-24T17:34:59.205 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.82254.log: /var/log/ceph/ceph-client.admin.63257.log: 53.2% -- replaced with /var/log/ceph/ceph-client.admin.63257.log.gz
2026-03-24T17:34:59.205 INFO:teuthology.orchestra.run.vm01.stderr: 85.2% -- replaced with /var/log/ceph/ceph-client.admin.82254.log.gz
2026-03-24T17:34:59.205 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55302.log
2026-03-24T17:34:59.206 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29090.log
2026-03-24T17:34:59.206 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.46655.log: /var/log/ceph/ceph-client.admin.55302.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46655.log.gz
2026-03-24T17:34:59.206 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55302.log.gz
2026-03-24T17:34:59.206 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50876.log
2026-03-24T17:34:59.206 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62510.log
2026-03-24T17:34:59.207 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.29090.log: /var/log/ceph/ceph-client.admin.50876.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29090.log.gz
2026-03-24T17:34:59.207 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with
/var/log/ceph/ceph-client.admin.50876.log.gz 2026-03-24T17:34:59.207 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89662.log 2026-03-24T17:34:59.207 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33115.log 2026-03-24T17:34:59.207 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.62510.log: /var/log/ceph/ceph-client.admin.89662.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62510.log.gz 2026-03-24T17:34:59.207 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89662.log.gz 2026-03-24T17:34:59.208 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42208.log 2026-03-24T17:34:59.208 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74909.log 2026-03-24T17:34:59.208 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.33115.log: /var/log/ceph/ceph-client.admin.42208.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.33115.log.gz 2026-03-24T17:34:59.208 INFO:teuthology.orchestra.run.vm01.stderr: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.42208.log.gz 2026-03-24T17:34:59.208 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88372.log 2026-03-24T17:34:59.208 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39792.log 2026-03-24T17:34:59.209 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.74909.log: /var/log/ceph/ceph-client.admin.88372.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74909.log.gz 2026-03-24T17:34:59.209 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88372.log.gz 2026-03-24T17:34:59.209 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.40092.log 2026-03-24T17:34:59.210 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.39792.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28811.log 2026-03-24T17:34:59.210 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39792.log.gz 2026-03-24T17:34:59.210 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.40092.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.40092.log.gz 2026-03-24T17:34:59.210 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42528.log 2026-03-24T17:34:59.210 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.28811.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28811.log.gz 2026-03-24T17:34:59.210 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77671.log 2026-03-24T17:34:59.211 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87009.log 2026-03-24T17:34:59.211 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.42528.log: /var/log/ceph/ceph-client.admin.77671.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42528.log.gz 2026-03-24T17:34:59.211 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77671.log.gz 2026-03-24T17:34:59.211 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55193.log 2026-03-24T17:34:59.211 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82237.log 2026-03-24T17:34:59.212 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.87009.log: /var/log/ceph/ceph-client.admin.55193.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87009.log.gz 2026-03-24T17:34:59.212 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.55193.log.gz 2026-03-24T17:34:59.212 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89999.log 2026-03-24T17:34:59.212 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60757.log 2026-03-24T17:34:59.212 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.82237.log: /var/log/ceph/ceph-client.admin.89999.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82237.log.gz 2026-03-24T17:34:59.212 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89999.log.gz 2026-03-24T17:34:59.213 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63491.log 2026-03-24T17:34:59.213 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40442.log 2026-03-24T17:34:59.213 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.60757.log: /var/log/ceph/ceph-client.admin.63491.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60757.log.gz 2026-03-24T17:34:59.213 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63491.log.gz 2026-03-24T17:34:59.213 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88501.log 2026-03-24T17:34:59.213 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59962.log 2026-03-24T17:34:59.214 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.40442.log: /var/log/ceph/ceph-client.admin.88501.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40442.log.gz 2026-03-24T17:34:59.214 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88501.log.gz 2026-03-24T17:34:59.214 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.85523.log 2026-03-24T17:34:59.214 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56553.log 2026-03-24T17:34:59.214 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.59962.log: /var/log/ceph/ceph-client.admin.85523.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59962.log.gz 2026-03-24T17:34:59.214 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85523.log.gz 2026-03-24T17:34:59.215 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80060.log 2026-03-24T17:34:59.215 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61640.log 2026-03-24T17:34:59.215 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.56553.log: /var/log/ceph/ceph-client.admin.80060.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56553.log.gz 2026-03-24T17:34:59.215 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80060.log.gz 2026-03-24T17:34:59.215 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57666.log 2026-03-24T17:34:59.216 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83402.log 2026-03-24T17:34:59.216 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.61640.log: /var/log/ceph/ceph-client.admin.57666.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61640.log.gz 2026-03-24T17:34:59.216 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57666.log.gz 2026-03-24T17:34:59.216 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47771.log 2026-03-24T17:34:59.216 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.38268.log 2026-03-24T17:34:59.217 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.83402.log: /var/log/ceph/ceph-client.admin.47771.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83402.log.gz 2026-03-24T17:34:59.217 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.47771.log.gz 2026-03-24T17:34:59.217 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52602.log 2026-03-24T17:34:59.217 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75471.log 2026-03-24T17:34:59.217 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.38268.log: /var/log/ceph/ceph-client.admin.52602.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52602.log.gz 2026-03-24T17:34:59.217 INFO:teuthology.orchestra.run.vm01.stderr: 25.3% -- replaced with /var/log/ceph/ceph-client.admin.38268.log.gz 2026-03-24T17:34:59.218 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78038.log 2026-03-24T17:34:59.218 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26674.log 2026-03-24T17:34:59.218 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.75471.log: /var/log/ceph/ceph-client.admin.78038.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75471.log.gz 2026-03-24T17:34:59.218 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78038.log.gz 2026-03-24T17:34:59.218 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50597.log 2026-03-24T17:34:59.218 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56028.log 2026-03-24T17:34:59.219 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.26674.log: 
/var/log/ceph/ceph-client.admin.50597.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50597.log.gz 2026-03-24T17:34:59.219 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.26674.log.gz 2026-03-24T17:34:59.219 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55107.log 2026-03-24T17:34:59.219 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55882.log 2026-03-24T17:34:59.219 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.56028.log: /var/log/ceph/ceph-client.admin.55107.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55107.log.gz 2026-03-24T17:34:59.219 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.56028.log.gz 2026-03-24T17:34:59.220 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34677.log 2026-03-24T17:34:59.220 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55839.log 2026-03-24T17:34:59.220 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.55882.log: /var/log/ceph/ceph-client.admin.34677.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34677.log.gz 2026-03-24T17:34:59.220 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.55882.log.gz 2026-03-24T17:34:59.220 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57724.log 2026-03-24T17:34:59.221 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58901.log 2026-03-24T17:34:59.221 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.55839.log: /var/log/ceph/ceph-client.admin.57724.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57724.log.gz 2026-03-24T17:34:59.221 
INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.55839.log.gz 2026-03-24T17:34:59.221 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35775.log 2026-03-24T17:34:59.222 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61488.log 2026-03-24T17:34:59.222 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.58901.log: /var/log/ceph/ceph-client.admin.35775.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35775.log.gz 2026-03-24T17:34:59.222 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58901.log.gz 2026-03-24T17:34:59.222 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70257.log 2026-03-24T17:34:59.222 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39298.log 2026-03-24T17:34:59.223 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.61488.log: /var/log/ceph/ceph-client.admin.70257.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61488.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70257.log.gz 2026-03-24T17:34:59.223 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.223 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65438.log 2026-03-24T17:34:59.223 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27501.log 2026-03-24T17:34:59.223 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.39298.log: /var/log/ceph/ceph-client.admin.65438.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39298.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65438.log.gz 2026-03-24T17:34:59.223 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.223 
INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42776.log 2026-03-24T17:34:59.224 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26805.log 2026-03-24T17:34:59.224 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.27501.log: /var/log/ceph/ceph-client.admin.42776.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27501.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42776.log.gz 2026-03-24T17:34:59.224 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.224 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35877.log 2026-03-24T17:34:59.224 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67856.log 2026-03-24T17:34:59.225 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.26805.log: /var/log/ceph/ceph-client.admin.35877.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26805.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35877.log.gz 2026-03-24T17:34:59.225 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.225 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31328.log 2026-03-24T17:34:59.225 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50984.log 2026-03-24T17:34:59.225 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.67856.log: /var/log/ceph/ceph-client.admin.31328.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67856.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31328.log.gz 2026-03-24T17:34:59.225 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.226 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62821.log 2026-03-24T17:34:59.226 
INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83074.log 2026-03-24T17:34:59.226 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.50984.log: /var/log/ceph/ceph-client.admin.62821.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50984.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62821.log.gz 2026-03-24T17:34:59.226 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.226 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83380.log 2026-03-24T17:34:59.226 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67340.log 2026-03-24T17:34:59.227 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.83074.log: /var/log/ceph/ceph-client.admin.83380.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83380.log.gz 2026-03-24T17:34:59.227 INFO:teuthology.orchestra.run.vm01.stderr: 55.9% -- replaced with /var/log/ceph/ceph-client.admin.83074.log.gz 2026-03-24T17:34:59.227 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73261.log 2026-03-24T17:34:59.227 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71535.log 2026-03-24T17:34:59.227 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.67340.log: /var/log/ceph/ceph-client.admin.73261.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67340.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73261.log.gz 2026-03-24T17:34:59.227 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.228 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73681.log 2026-03-24T17:34:59.228 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78598.log 2026-03-24T17:34:59.228 
INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.71535.log: /var/log/ceph/ceph-client.admin.73681.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73681.log.gz 2026-03-24T17:34:59.228 INFO:teuthology.orchestra.run.vm01.stderr: 26.2% -- replaced with /var/log/ceph/ceph-client.admin.71535.log.gz 2026-03-24T17:34:59.228 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30293.log 2026-03-24T17:34:59.229 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84897.log 2026-03-24T17:34:59.229 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.78598.log: /var/log/ceph/ceph-client.admin.30293.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78598.log.gz 0.0% 2026-03-24T17:34:59.229 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.30293.log.gz 2026-03-24T17:34:59.229 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35553.log 2026-03-24T17:34:59.229 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72785.log 2026-03-24T17:34:59.230 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.84897.log: /var/log/ceph/ceph-client.admin.35553.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35553.log.gz 2026-03-24T17:34:59.230 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.84897.log.gz 2026-03-24T17:34:59.230 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35099.log 2026-03-24T17:34:59.230 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52518.log 2026-03-24T17:34:59.230 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.72785.log: /var/log/ceph/ceph-client.admin.35099.log: 0.0% 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.72785.log.gz -- replaced with /var/log/ceph/ceph-client.admin.35099.log.gz 2026-03-24T17:34:59.230 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.231 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73417.log 2026-03-24T17:34:59.231 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29498.log 2026-03-24T17:34:59.231 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.52518.log: /var/log/ceph/ceph-client.admin.73417.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73417.log.gz 2026-03-24T17:34:59.231 INFO:teuthology.orchestra.run.vm01.stderr: 55.5% -- replaced with /var/log/ceph/ceph-client.admin.52518.log.gz 2026-03-24T17:34:59.231 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43902.log 2026-03-24T17:34:59.232 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82437.log 2026-03-24T17:34:59.232 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.29498.log: /var/log/ceph/ceph-client.admin.43902.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29498.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43902.log.gz 2026-03-24T17:34:59.232 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.232 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51134.log 2026-03-24T17:34:59.232 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49664.log 2026-03-24T17:34:59.232 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.82437.log: /var/log/ceph/ceph-client.admin.51134.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51134.log.gz 2026-03-24T17:34:59.233 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with 
/var/log/ceph/ceph-client.admin.82437.log.gz 2026-03-24T17:34:59.233 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50833.log 2026-03-24T17:34:59.233 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42879.log 2026-03-24T17:34:59.233 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.49664.log: /var/log/ceph/ceph-client.admin.50833.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49664.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50833.log.gz 2026-03-24T17:34:59.233 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.234 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61661.log 2026-03-24T17:34:59.234 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82220.log 2026-03-24T17:34:59.234 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.42879.log: /var/log/ceph/ceph-client.admin.61661.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42879.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61661.log.gz 2026-03-24T17:34:59.234 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.234 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79649.log 2026-03-24T17:34:59.235 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67491.log 2026-03-24T17:34:59.235 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.82220.log: /var/log/ceph/ceph-client.admin.79649.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82220.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79649.log.gz 2026-03-24T17:34:59.235 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.235 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.26924.log 2026-03-24T17:34:59.235 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39384.log 2026-03-24T17:34:59.235 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.67491.log: /var/log/ceph/ceph-client.admin.26924.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81085.log 2026-03-24T17:34:59.235 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26924.log.gz 2026-03-24T17:34:59.236 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67491.log.gz 2026-03-24T17:34:59.236 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.39384.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39384.log.gz 2026-03-24T17:34:59.236 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57684.log 2026-03-24T17:34:59.236 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62585.log 2026-03-24T17:34:59.236 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.81085.log: /var/log/ceph/ceph-client.admin.57684.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87219.log 2026-03-24T17:34:59.236 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57684.log.gz 2026-03-24T17:34:59.236 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81085.log.gz 2026-03-24T17:34:59.236 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.62585.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62585.log.gz 2026-03-24T17:34:59.237 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52860.log 2026-03-24T17:34:59.237 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.40722.log 2026-03-24T17:34:59.237 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.87219.log: /var/log/ceph/ceph-client.admin.52860.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70386.log 2026-03-24T17:34:59.237 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52860.log.gz 2026-03-24T17:34:59.237 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87219.log.gz 2026-03-24T17:34:59.237 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.40722.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40722.log.gz 2026-03-24T17:34:59.237 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54871.log 2026-03-24T17:34:59.238 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47972.log 2026-03-24T17:34:59.238 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.70386.log: /var/log/ceph/ceph-client.admin.54871.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75600.log 2026-03-24T17:34:59.238 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54871.log.gz 2026-03-24T17:34:59.238 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70386.log.gz 2026-03-24T17:34:59.238 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.47972.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68680.log 2026-03-24T17:34:59.238 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73337.log 2026-03-24T17:34:59.239 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.75600.log: /var/log/ceph/ceph-client.admin.68680.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75600.log.gz 2026-03-24T17:34:59.239 
INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68680.log.gz 2026-03-24T17:34:59.239 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44165.log 2026-03-24T17:34:59.239 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42409.log 2026-03-24T17:34:59.239 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.73337.log: /var/log/ceph/ceph-client.admin.44165.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73337.log.gz 2026-03-24T17:34:59.239 INFO:teuthology.orchestra.run.vm01.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.44165.log.gz 2026-03-24T17:34:59.240 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31521.log 2026-03-24T17:34:59.240 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29713.log 2026-03-24T17:34:59.240 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.42409.log: /var/log/ceph/ceph-client.admin.31521.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42409.log.gz 0.0% 2026-03-24T17:34:59.240 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.31521.log.gz 2026-03-24T17:34:59.240 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75210.log 2026-03-24T17:34:59.241 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45556.log 2026-03-24T17:34:59.241 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.29713.log: /var/log/ceph/ceph-client.admin.75210.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29713.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75210.log.gz 2026-03-24T17:34:59.241 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.241 
INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37924.log 2026-03-24T17:34:59.241 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87437.log 2026-03-24T17:34:59.242 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.37924.log: /var/log/ceph/ceph-client.admin.45556.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45556.log.gz 2026-03-24T17:34:59.242 INFO:teuthology.orchestra.run.vm01.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.37924.log.gz 2026-03-24T17:34:59.242 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47608.log 2026-03-24T17:34:59.242 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55667.log 2026-03-24T17:34:59.242 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.87437.log: /var/log/ceph/ceph-client.admin.47608.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47608.log.gz 2026-03-24T17:34:59.243 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.87437.log.gz 2026-03-24T17:34:59.243 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44586.log 2026-03-24T17:34:59.243 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41450.log 2026-03-24T17:34:59.243 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.55667.log: /var/log/ceph/ceph-client.admin.44586.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55667.log.gz 2026-03-24T17:34:59.243 INFO:teuthology.orchestra.run.vm01.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.44586.log.gz 2026-03-24T17:34:59.243 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88845.log 2026-03-24T17:34:59.244 
INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66597.log 2026-03-24T17:34:59.244 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.41450.log: /var/log/ceph/ceph-client.admin.88845.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41450.log.gz 2026-03-24T17:34:59.244 INFO:teuthology.orchestra.run.vm01.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.88845.log.gz 2026-03-24T17:34:59.244 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47500.log 2026-03-24T17:34:59.244 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53906.log 2026-03-24T17:34:59.245 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.66597.log: /var/log/ceph/ceph-client.admin.47500.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47500.log.gz 2026-03-24T17:34:59.245 INFO:teuthology.orchestra.run.vm01.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.66597.log.gz 2026-03-24T17:34:59.245 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54156.log 2026-03-24T17:34:59.245 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46008.log 2026-03-24T17:34:59.245 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.53906.log: /var/log/ceph/ceph-client.admin.54156.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53906.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54156.log.gz 2026-03-24T17:34:59.245 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.246 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25615.log 2026-03-24T17:34:59.246 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43484.log 2026-03-24T17:34:59.246 
INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.46008.log: /var/log/ceph/ceph-client.admin.25615.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46008.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25615.log.gz 2026-03-24T17:34:59.246 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.246 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73155.log 2026-03-24T17:34:59.247 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78774.log 2026-03-24T17:34:59.247 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.43484.log: /var/log/ceph/ceph-client.admin.73155.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73155.log.gz 2026-03-24T17:34:59.247 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.43484.log.gz 2026-03-24T17:34:59.247 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37861.log 2026-03-24T17:34:59.247 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43218.log 2026-03-24T17:34:59.248 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.78774.log: /var/log/ceph/ceph-client.admin.37861.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78774.log.gz 2026-03-24T17:34:59.248 INFO:teuthology.orchestra.run.vm01.stderr: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.37861.log.gz 2026-03-24T17:34:59.248 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74318.log 2026-03-24T17:34:59.248 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57479.log 2026-03-24T17:34:59.248 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.43218.log: /var/log/ceph/ceph-client.admin.74318.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.43218.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74318.log.gz 2026-03-24T17:34:59.248 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.249 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87317.log 2026-03-24T17:34:59.249 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55646.log 2026-03-24T17:34:59.249 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.57479.log: /var/log/ceph/ceph-client.admin.87317.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57479.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87317.log.gz 2026-03-24T17:34:59.249 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.249 INFO:teuthology.orchestra.run.vm01.stderr: 30.5% -- replaced with /var/log/ceph/ceph-client.admin.47972.log.gz 2026-03-24T17:34:59.250 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30623.log 2026-03-24T17:34:59.250 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63216.log 2026-03-24T17:34:59.250 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.55646.log: /var/log/ceph/ceph-client.admin.30623.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55646.log.gz 2026-03-24T17:34:59.250 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30623.log.gz 2026-03-24T17:34:59.250 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53886.log 2026-03-24T17:34:59.251 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79004.log 2026-03-24T17:34:59.251 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.63216.log: 0.0%/var/log/ceph/ceph-client.admin.53886.log: -- replaced with 
/var/log/ceph/ceph-client.admin.63216.log.gz 2026-03-24T17:34:59.251 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53886.log.gz 2026-03-24T17:34:59.251 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45879.log 2026-03-24T17:34:59.251 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64318.log 2026-03-24T17:34:59.252 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.79004.log: /var/log/ceph/ceph-client.admin.45879.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79004.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45879.log.gz 2026-03-24T17:34:59.252 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.252 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31956.log 2026-03-24T17:34:59.252 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58363.log 2026-03-24T17:34:59.252 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.64318.log: /var/log/ceph/ceph-client.admin.31956.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64318.log.gz 2026-03-24T17:34:59.252 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.31956.log.gz 2026-03-24T17:34:59.253 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64807.log 2026-03-24T17:34:59.253 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79262.log 2026-03-24T17:34:59.253 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.58363.log: /var/log/ceph/ceph-client.admin.64807.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64807.log.gz 2026-03-24T17:34:59.253 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with 
/var/log/ceph/ceph-client.admin.58363.log.gz 2026-03-24T17:34:59.253 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60628.log 2026-03-24T17:34:59.254 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.79262.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79262.log.gz 2026-03-24T17:34:59.254 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47521.log 2026-03-24T17:34:59.254 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.60628.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60628.log.gz 2026-03-24T17:34:59.254 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63892.log 2026-03-24T17:34:59.254 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69249.log 2026-03-24T17:34:59.255 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.47521.log: /var/log/ceph/ceph-client.admin.63892.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47521.log.gz 2026-03-24T17:34:59.255 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63892.log.gz 2026-03-24T17:34:59.255 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27539.log 2026-03-24T17:34:59.255 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49858.log 2026-03-24T17:34:59.255 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.69249.log: /var/log/ceph/ceph-client.admin.27539.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27539.log.gz 2026-03-24T17:34:59.255 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.69249.log.gz 2026-03-24T17:34:59.256 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.32075.log 2026-03-24T17:34:59.256 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30057.log 2026-03-24T17:34:59.256 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.49858.log: /var/log/ceph/ceph-client.admin.32075.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49858.log.gz 2026-03-24T17:34:59.256 INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32075.log.gz 2026-03-24T17:34:59.256 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48223.log 2026-03-24T17:34:59.257 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89490.log 2026-03-24T17:34:59.257 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.30057.log: /var/log/ceph/ceph-client.admin.48223.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30057.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48223.log.gz 2026-03-24T17:34:59.257 INFO:teuthology.orchestra.run.vm01.stderr: 2026-03-24T17:34:59.257 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30452.log 2026-03-24T17:34:59.258 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53268.log 2026-03-24T17:34:59.258 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.89490.log: /var/log/ceph/ceph-client.admin.30452.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89490.log.gz 2026-03-24T17:34:59.258 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30452.log.gz 2026-03-24T17:34:59.258 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51732.log 2026-03-24T17:34:59.258 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.72488.log 2026-03-24T17:34:59.258 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.53268.log: /var/log/ceph/ceph-client.admin.51732.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53268.log.gz 2026-03-24T17:34:59.259 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51732.log.gz 2026-03-24T17:34:59.259 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68393.log 2026-03-24T17:34:59.259 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78073.log 2026-03-24T17:34:59.259 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.72488.log: /var/log/ceph/ceph-client.admin.68393.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68393.log.gz 2026-03-24T17:34:59.259 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.72488.log.gz 2026-03-24T17:34:59.259 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48739.log 2026-03-24T17:34:59.260 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63414.log 2026-03-24T17:34:59.260 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.78073.log: 0.0%/var/log/ceph/ceph-client.admin.48739.log: -- replaced with /var/log/ceph/ceph-client.admin.78073.log.gz 2026-03-24T17:34:59.260 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48739.log.gz 2026-03-24T17:34:59.260 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41866.log 2026-03-24T17:34:59.260 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36763.log 2026-03-24T17:34:59.261 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.63414.log: 
/var/log/ceph/ceph-client.admin.41866.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63414.log.gz 2026-03-24T17:34:59.261 INFO:teuthology.orchestra.run.vm01.stderr: 58.1% -- replaced with /var/log/ceph/ceph-client.admin.41866.log.gz 2026-03-24T17:34:59.261 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38562.log 2026-03-24T17:34:59.261 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66932.log 2026-03-24T17:34:59.261 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.36763.log: /var/log/ceph/ceph-client.admin.38562.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36763.log.gz 2026-03-24T17:34:59.261 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38562.log.gz 2026-03-24T17:34:59.262 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58427.log 2026-03-24T17:34:59.262 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70064.log 2026-03-24T17:34:59.262 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.66932.log: /var/log/ceph/ceph-client.admin.58427.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58427.log.gz 2026-03-24T17:34:59.262 INFO:teuthology.orchestra.run.vm01.stderr: -- replaced with /var/log/ceph/ceph-client.admin.66932.log.gz 2026-03-24T17:34:59.262 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28535.log 2026-03-24T17:34:59.263 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65279.log 2026-03-24T17:34:59.263 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.28535.log: /var/log/ceph/ceph-client.admin.70064.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70064.log.gz 2026-03-24T17:34:59.263 
INFO:teuthology.orchestra.run.vm01.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.28535.log.gz 2026-03-24T17:34:59.263 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81641.log 2026-03-24T17:34:59.263 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51048.log 2026-03-24T17:34:59.264 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.65279.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65279.log.gz 2026-03-24T17:34:59.264 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.81641.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81641.log.gz 2026-03-24T17:34:59.264 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68136.log 2026-03-24T17:34:59.264 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79154.log 2026-03-24T17:34:59.264 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.51048.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51048.log.gz 2026-03-24T17:34:59.264 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.68136.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68136.log.gz 2026-03-24T17:34:59.265 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79541.log 2026-03-24T17:34:59.265 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37336.log 2026-03-24T17:34:59.265 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.79154.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79154.log.gz 2026-03-24T17:34:59.265 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.79541.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79541.log.gz 2026-03-24T17:34:59.265 
INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-osd.2.log 2026-03-24T17:34:59.266 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38940.log 2026-03-24T17:34:59.266 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.37336.log: 25.8%/var/log/ceph/ceph-osd.2.log: -- replaced with /var/log/ceph/ceph-client.admin.37336.log.gz 2026-03-24T17:34:59.266 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58859.log 2026-03-24T17:34:59.267 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.38940.log: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.38940.log.gz 2026-03-24T17:34:59.278 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28854.log 2026-03-24T17:34:59.278 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.58859.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58859.log.gz 2026-03-24T17:34:59.278 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76073.log 2026-03-24T17:34:59.278 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.28854.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28854.log.gz 2026-03-24T17:34:59.278 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78373.log 2026-03-24T17:34:59.279 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.76073.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76073.log.gz 2026-03-24T17:34:59.279 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53499.log 2026-03-24T17:34:59.279 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.78373.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78373.log.gz 2026-03-24T17:34:59.280 
INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45470.log 2026-03-24T17:34:59.280 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.53499.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53499.log.gz 2026-03-24T17:34:59.280 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55968.log 2026-03-24T17:34:59.280 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.45470.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45470.log.gz 2026-03-24T17:34:59.281 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43364.log 2026-03-24T17:34:59.281 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.55968.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55968.log.gz 2026-03-24T17:34:59.281 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41536.log 2026-03-24T17:34:59.282 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.43364.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.43364.log.gz 2026-03-24T17:34:59.282 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39513.log 2026-03-24T17:34:59.282 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.41536.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41536.log.gz 2026-03-24T17:34:59.283 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64027.log 2026-03-24T17:34:59.283 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.39513.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39513.log.gz 2026-03-24T17:34:59.283 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60048.log 2026-03-24T17:34:59.284 
INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.64027.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64027.log.gz 2026-03-24T17:34:59.284 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44282.log 2026-03-24T17:34:59.284 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.60048.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60048.log.gz 2026-03-24T17:34:59.284 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56532.log 2026-03-24T17:34:59.285 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.44282.log: 26.3% -- replaced with /var/log/ceph/ceph-client.admin.44282.log.gz 2026-03-24T17:34:59.285 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29433.log 2026-03-24T17:34:59.286 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.56532.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56532.log.gz 2026-03-24T17:34:59.286 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37462.log 2026-03-24T17:34:59.286 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.29433.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29433.log.gz 2026-03-24T17:34:59.286 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72509.log 2026-03-24T17:34:59.287 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.37462.log: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.37462.log.gz 2026-03-24T17:34:59.287 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58621.log 2026-03-24T17:34:59.287 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.72509.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72509.log.gz 
2026-03-24T17:34:59.288 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50304.log 2026-03-24T17:34:59.288 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.58621.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58621.log.gz 2026-03-24T17:34:59.288 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62801.log 2026-03-24T17:34:59.289 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.50304.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50304.log.gz 2026-03-24T17:34:59.289 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40583.log 2026-03-24T17:34:59.289 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.62801.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62801.log.gz 2026-03-24T17:34:59.289 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76562.log 2026-03-24T17:34:59.290 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.40583.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40583.log.gz 2026-03-24T17:34:59.290 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65694.log 2026-03-24T17:34:59.290 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.76562.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76562.log.gz 2026-03-24T17:34:59.291 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46029.log 2026-03-24T17:34:59.291 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.65694.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65694.log.gz 2026-03-24T17:34:59.291 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59864.log 2026-03-24T17:34:59.292 
INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.46029.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46029.log.gz 2026-03-24T17:34:59.292 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69378.log 2026-03-24T17:34:59.292 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.59864.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59864.log.gz 2026-03-24T17:34:59.292 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57828.log 2026-03-24T17:34:59.293 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.69378.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69378.log.gz 2026-03-24T17:34:59.293 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38163.log 2026-03-24T17:34:59.293 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.57828.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57828.log.gz 2026-03-24T17:34:59.294 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74612.log 2026-03-24T17:34:59.294 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.38163.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.38163.log.gz 2026-03-24T17:34:59.294 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66476.log 2026-03-24T17:34:59.295 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.74612.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74612.log.gz 2026-03-24T17:34:59.295 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76385.log 2026-03-24T17:34:59.295 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.66476.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66476.log.gz 
2026-03-24T17:34:59.295 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29906.log 2026-03-24T17:34:59.296 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.76385.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76385.log.gz 2026-03-24T17:34:59.296 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61982.log 2026-03-24T17:34:59.296 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.29906.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29906.log.gz 2026-03-24T17:34:59.297 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67362.log 2026-03-24T17:34:59.297 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.61982.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61982.log.gz 2026-03-24T17:34:59.297 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53311.log 2026-03-24T17:34:59.298 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.67362.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67362.log.gz 2026-03-24T17:34:59.298 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47828.log 2026-03-24T17:34:59.298 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.53311.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53311.log.gz 2026-03-24T17:34:59.299 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31564.log 2026-03-24T17:34:59.299 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.47828.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47828.log.gz 2026-03-24T17:34:59.299 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29412.log 2026-03-24T17:34:59.300 
INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.31564.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31564.log.gz 2026-03-24T17:34:59.300 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88174.log 2026-03-24T17:34:59.300 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.29412.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29412.log.gz 2026-03-24T17:34:59.300 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27638.log 2026-03-24T17:34:59.301 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.88174.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88174.log.gz 2026-03-24T17:34:59.301 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45578.log 2026-03-24T17:34:59.301 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.27638.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27638.log.gz 2026-03-24T17:34:59.302 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84414.log 2026-03-24T17:34:59.302 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.45578.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45578.log.gz 2026-03-24T17:34:59.302 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33917.log 2026-03-24T17:34:59.302 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.84414.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84414.log.gz 2026-03-24T17:34:59.303 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50159.log 2026-03-24T17:34:59.303 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.33917.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33917.log.gz 
2026-03-24T17:34:59.303 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88738.log
2026-03-24T17:34:59.304 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.50159.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50159.log.gz
2026-03-24T17:34:59.304 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62219.log
2026-03-24T17:34:59.304 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.88738.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88738.log.gz
2026-03-24T17:34:59.305 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48263.log
2026-03-24T17:34:59.305 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.62219.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62219.log.gz
2026-03-24T17:34:59.305 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30571.log
2026-03-24T17:34:59.305 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.48263.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48263.log.gz
2026-03-24T17:34:59.306 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40129.log
2026-03-24T17:34:59.306 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.30571.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30571.log.gz
2026-03-24T17:34:59.306 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32484.log
2026-03-24T17:34:59.307 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.40129.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40129.log.gz
2026-03-24T17:34:59.307 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81230.log
2026-03-24T17:34:59.307 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.32484.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32484.log.gz
2026-03-24T17:34:59.308 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45492.log
2026-03-24T17:34:59.308 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.81230.log: 53.6% -- replaced with /var/log/ceph/ceph-client.admin.81230.log.gz
2026-03-24T17:34:59.308 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60671.log
2026-03-24T17:34:59.309 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.45492.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45492.log.gz
2026-03-24T17:34:59.309 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70021.log
2026-03-24T17:34:59.309 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.60671.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60671.log.gz
2026-03-24T17:34:59.310 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-mon.a.log
2026-03-24T17:34:59.310 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.70021.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70021.log.gz
2026-03-24T17:34:59.310 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75772.log
2026-03-24T17:34:59.321 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-mon.a.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67985.log
2026-03-24T17:34:59.322 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.75772.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75772.log.gz
2026-03-24T17:34:59.329 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83117.log
2026-03-24T17:34:59.330 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.67985.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67985.log.gz
2026-03-24T17:34:59.341 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83939.log
2026-03-24T17:34:59.342 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.83117.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83117.log.gz
2026-03-24T17:34:59.353 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36149.log
2026-03-24T17:34:59.354 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.83939.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83939.log.gz
2026-03-24T17:34:59.365 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66146.log
2026-03-24T17:34:59.366 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.36149.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36149.log.gz
2026-03-24T17:34:59.369 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52946.log
2026-03-24T17:34:59.369 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.66146.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66146.log.gz
2026-03-24T17:34:59.377 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49729.log
2026-03-24T17:34:59.378 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.52946.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52946.log.gz
2026-03-24T17:34:59.385 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63946.log
2026-03-24T17:34:59.386 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.49729.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49729.log.gz
2026-03-24T17:34:59.393 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74204.log
2026-03-24T17:34:59.394 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.63946.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63946.log.gz
2026-03-24T17:34:59.401 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69786.log
2026-03-24T17:34:59.402 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.74204.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74204.log.gz
2026-03-24T17:34:59.409 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80124.log
2026-03-24T17:34:59.410 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.69786.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69786.log.gz
2026-03-24T17:34:59.425 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29992.log
2026-03-24T17:34:59.426 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.80124.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80124.log.gz
2026-03-24T17:34:59.433 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85673.log
2026-03-24T17:34:59.434 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.29992.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29992.log.gz
2026-03-24T17:34:59.441 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32450.log
2026-03-24T17:34:59.442 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.85673.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85673.log.gz
2026-03-24T17:34:59.449 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64593.log
2026-03-24T17:34:59.457 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.32450.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32450.log.gz
2026-03-24T17:34:59.458 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78910.log
2026-03-24T17:34:59.465 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.64593.log: 57.9% -- replaced with /var/log/ceph/ceph-client.admin.64593.log.gz
2026-03-24T17:34:59.473 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47306.log
2026-03-24T17:34:59.474 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.78910.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78910.log.gz
2026-03-24T17:34:59.481 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61402.log
2026-03-24T17:34:59.482 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.47306.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47306.log.gz
2026-03-24T17:34:59.489 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60198.log
2026-03-24T17:34:59.490 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.61402.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61402.log.gz
2026-03-24T17:34:59.501 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29262.log
2026-03-24T17:34:59.502 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.60198.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60198.log.gz
2026-03-24T17:34:59.515 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38184.log
2026-03-24T17:34:59.515 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.29262.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29262.log.gz
2026-03-24T17:34:59.521 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31371.log
2026-03-24T17:34:59.529 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.38184.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56618.log
2026-03-24T17:34:59.530 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.31371.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31371.log.gz
2026-03-24T17:34:59.533 INFO:teuthology.orchestra.run.vm01.stderr: 25.7% -- replaced with /var/log/ceph/ceph-client.admin.38184.log.gz
2026-03-24T17:34:59.545 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54339.log
2026-03-24T17:34:59.546 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.56618.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56618.log.gz
2026-03-24T17:34:59.553 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87259.log
2026-03-24T17:34:59.554 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.54339.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54339.log.gz
2026-03-24T17:34:59.561 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40222.log
2026-03-24T17:34:59.562 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.87259.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87259.log.gz
2026-03-24T17:34:59.569 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34637.log
2026-03-24T17:34:59.570 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.40222.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40222.log.gz
2026-03-24T17:34:59.577 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53338.log
2026-03-24T17:34:59.578 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.34637.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34637.log.gz
2026-03-24T17:34:59.589 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43563.log
2026-03-24T17:34:59.590 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.53338.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53338.log.gz
2026-03-24T17:34:59.593 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86837.log
2026-03-24T17:34:59.605 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.43563.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.43563.log.gz
2026-03-24T17:34:59.610 INFO:teuthology.orchestra.run.vm01.stderr:/var/log/ceph/ceph-client.admin.86837.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86837.log.gz
2026-03-24T17:35:00.081 INFO:teuthology.orchestra.run.vm01.stderr: 92.0% -- replaced with /var/log/ceph/ceph-mon.a.log.gz
2026-03-24T17:35:07.492 INFO:teuthology.orchestra.run.vm01.stderr: 94.0% -- replaced with /var/log/ceph/ceph-osd.2.log.gz
2026-03-24T17:35:09.623 INFO:teuthology.orchestra.run.vm01.stderr: 93.8% -- replaced with /var/log/ceph/ceph-osd.0.log.gz
2026-03-24T17:35:12.685 INFO:teuthology.orchestra.run.vm01.stderr: 94.1% -- replaced with /var/log/ceph/ceph-osd.1.log.gz
2026-03-24T17:35:12.686 INFO:teuthology.orchestra.run.vm01.stderr:
2026-03-24T17:35:12.686 INFO:teuthology.orchestra.run.vm01.stderr:real 0m14.351s
2026-03-24T17:35:12.686 INFO:teuthology.orchestra.run.vm01.stderr:user 0m32.990s
2026-03-24T17:35:12.686 INFO:teuthology.orchestra.run.vm01.stderr:sys 0m2.552s
2026-03-24T17:35:12.686 INFO:tasks.ceph:Archiving logs...
2026-03-24T17:35:12.686 DEBUG:teuthology.misc:Transferring archived files from vm01:/var/log/ceph to /archive/kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps/3621/remote/vm01/log
2026-03-24T17:35:12.686 DEBUG:teuthology.orchestra.run.vm01:> sudo tar c -f - -C /var/log/ceph -- .
2026-03-24T17:35:14.397 DEBUG:teuthology.run_tasks:Unwinding manager install
2026-03-24T17:35:14.399 INFO:teuthology.task.install.util:Removing shipped files: /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer...
2026-03-24T17:35:14.399 DEBUG:teuthology.orchestra.run.vm01:> sudo rm -f -- /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer
2026-03-24T17:35:14.423 INFO:teuthology.task.install.deb:Removing packages: ceph, cephadm, ceph-mds, ceph-mgr, ceph-common, ceph-fuse, ceph-test, ceph-volume, radosgw, python3-rados, python3-rgw, python3-cephfs, python3-rbd, libcephfs2, libcephfs-dev, librados2, librbd1, rbd-fuse on Debian system.
2026-03-24T17:35:14.423 DEBUG:teuthology.orchestra.run.vm01:> for d in ceph cephadm ceph-mds ceph-mgr ceph-common ceph-fuse ceph-test ceph-volume radosgw python3-rados python3-rgw python3-cephfs python3-rbd libcephfs2 libcephfs-dev librados2 librbd1 rbd-fuse ; do sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" purge $d || true ; done
2026-03-24T17:35:14.496 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-24T17:35:14.631 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-24T17:35:14.631 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-24T17:35:14.732 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T17:35:14.732 INFO:teuthology.orchestra.run.vm01.stdout:  ceph-mon kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-24T17:35:14.732 INFO:teuthology.orchestra.run.vm01.stdout:  libsgutils2-2 sg3-utils sg3-utils-udev
2026-03-24T17:35:14.732 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T17:35:14.744 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED:
2026-03-24T17:35:14.744 INFO:teuthology.orchestra.run.vm01.stdout:  ceph*
2026-03-24T17:35:14.912 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 1 to remove and 44 not upgraded.
2026-03-24T17:35:14.912 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 47.1 kB disk space will be freed.
2026-03-24T17:35:14.959 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 126150 files and directories currently installed.)
2026-03-24T17:35:14.961 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T17:35:15.988 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T17:35:16.023 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-24T17:35:16.181 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-24T17:35:16.182 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-24T17:35:16.300 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T17:35:16.300 INFO:teuthology.orchestra.run.vm01.stdout:  ceph-mon kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-24T17:35:16.300 INFO:teuthology.orchestra.run.vm01.stdout:  libsgutils2-2 python-asyncssh-doc python3-asyncssh sg3-utils sg3-utils-udev
2026-03-24T17:35:16.301 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T17:35:16.309 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED:
2026-03-24T17:35:16.309 INFO:teuthology.orchestra.run.vm01.stdout:  ceph-mgr-cephadm* cephadm*
2026-03-24T17:35:16.458 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 2 to remove and 44 not upgraded.
2026-03-24T17:35:16.458 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 2177 kB disk space will be freed.
2026-03-24T17:35:16.493 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 126148 files and directories currently installed.)
2026-03-24T17:35:16.495 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-mgr-cephadm (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T17:35:16.507 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-cephadm, directory '/usr/share/ceph/mgr/cephadm/services' not empty so not removed
2026-03-24T17:35:16.515 INFO:teuthology.orchestra.run.vm01.stdout:Removing cephadm (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T17:35:16.545 INFO:teuthology.orchestra.run.vm01.stdout:Looking for files to backup/remove ...
2026-03-24T17:35:16.546 INFO:teuthology.orchestra.run.vm01.stdout:Not backing up/removing `/var/lib/cephadm', it matches ^/var/.*.
2026-03-24T17:35:16.548 INFO:teuthology.orchestra.run.vm01.stdout:Removing user `cephadm' ...
2026-03-24T17:35:16.548 INFO:teuthology.orchestra.run.vm01.stdout:Warning: group `nogroup' has no more members.
2026-03-24T17:35:16.574 INFO:teuthology.orchestra.run.vm01.stdout:Done.
2026-03-24T17:35:16.596 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-24T17:35:16.687 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 126062 files and directories currently installed.)
2026-03-24T17:35:16.689 INFO:teuthology.orchestra.run.vm01.stdout:Purging configuration files for cephadm (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T17:35:17.621 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T17:35:17.657 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-24T17:35:17.813 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-24T17:35:17.814 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-24T17:35:17.923 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T17:35:17.923 INFO:teuthology.orchestra.run.vm01.stdout:  ceph-mon kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-24T17:35:17.924 INFO:teuthology.orchestra.run.vm01.stdout:  libsgutils2-2 python-asyncssh-doc python3-asyncssh sg3-utils sg3-utils-udev
2026-03-24T17:35:17.924 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T17:35:17.932 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED:
2026-03-24T17:35:17.932 INFO:teuthology.orchestra.run.vm01.stdout:  ceph-mds*
2026-03-24T17:35:18.085 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 1 to remove and 44 not upgraded.
2026-03-24T17:35:18.085 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 6851 kB disk space will be freed.
2026-03-24T17:35:18.121 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 126062 files and directories currently installed.)
2026-03-24T17:35:18.123 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-mds (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T17:35:18.545 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-24T17:35:18.633 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 126054 files and directories currently installed.)
2026-03-24T17:35:18.635 INFO:teuthology.orchestra.run.vm01.stdout:Purging configuration files for ceph-mds (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T17:35:20.018 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T17:35:20.053 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-24T17:35:20.210 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-24T17:35:20.211 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-24T17:35:20.325 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T17:35:20.325 INFO:teuthology.orchestra.run.vm01.stdout:  ceph-mgr-modules-core ceph-mon kpartx libboost-iostreams1.74.0
2026-03-24T17:35:20.326 INFO:teuthology.orchestra.run.vm01.stdout:  libboost-thread1.74.0 libpmemobj1 libsgutils2-2 python-asyncssh-doc
2026-03-24T17:35:20.326 INFO:teuthology.orchestra.run.vm01.stdout:  python3-asyncssh python3-cachetools python3-cheroot python3-cherrypy3
2026-03-24T17:35:20.326 INFO:teuthology.orchestra.run.vm01.stdout:  python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-24T17:35:20.326 INFO:teuthology.orchestra.run.vm01.stdout:  python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-24T17:35:20.326 INFO:teuthology.orchestra.run.vm01.stdout:  python3-kubernetes python3-natsort python3-portend python3-psutil
2026-03-24T17:35:20.326 INFO:teuthology.orchestra.run.vm01.stdout:  python3-repoze.lru python3-requests-oauthlib python3-routes python3-rsa
2026-03-24T17:35:20.326 INFO:teuthology.orchestra.run.vm01.stdout:  python3-simplejson python3-sklearn python3-sklearn-lib python3-tempora
2026-03-24T17:35:20.326 INFO:teuthology.orchestra.run.vm01.stdout:  python3-threadpoolctl python3-webob python3-websocket python3-zc.lockfile
2026-03-24T17:35:20.326 INFO:teuthology.orchestra.run.vm01.stdout:  sg3-utils sg3-utils-udev
2026-03-24T17:35:20.326 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T17:35:20.334 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED:
2026-03-24T17:35:20.334 INFO:teuthology.orchestra.run.vm01.stdout:  ceph-mgr* ceph-mgr-dashboard* ceph-mgr-diskprediction-local*
2026-03-24T17:35:20.334 INFO:teuthology.orchestra.run.vm01.stdout:  ceph-mgr-k8sevents*
2026-03-24T17:35:20.487 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 4 to remove and 44 not upgraded.
2026-03-24T17:35:20.487 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 219 MB disk space will be freed.
2026-03-24T17:35:20.523 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 126054 files and directories currently installed.)
2026-03-24T17:35:20.524 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-mgr-k8sevents (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T17:35:20.536 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-mgr-diskprediction-local (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T17:35:20.548 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-diskprediction-local, directory '/usr/share/ceph/mgr/diskprediction_local' not empty so not removed
2026-03-24T17:35:20.557 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-mgr-dashboard (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T17:35:20.612 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-dashboard, directory '/usr/share/ceph/mgr/dashboard/services/auth' not empty so not removed
2026-03-24T17:35:20.612 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-dashboard, directory '/usr/share/ceph/mgr/dashboard/plugins' not empty so not removed
2026-03-24T17:35:20.612 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-dashboard, directory '/usr/share/ceph/mgr/dashboard/model' not empty so not removed
2026-03-24T17:35:20.612 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-dashboard, directory '/usr/share/ceph/mgr/dashboard/controllers' not empty so not removed
2026-03-24T17:35:20.612 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-dashboard, directory '/usr/share/ceph/mgr/dashboard/api' not empty so not removed
2026-03-24T17:35:20.623 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-mgr (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T17:35:21.079 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 124272 files and directories currently installed.)
2026-03-24T17:35:21.081 INFO:teuthology.orchestra.run.vm01.stdout:Purging configuration files for ceph-mgr (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T17:35:21.464 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr, directory '/var/lib/ceph/mgr' not empty so not removed
2026-03-24T17:35:22.431 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T17:35:22.466 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-24T17:35:22.625 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-24T17:35:22.625 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-24T17:35:22.736 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T17:35:22.736 INFO:teuthology.orchestra.run.vm01.stdout:  ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-24T17:35:22.736 INFO:teuthology.orchestra.run.vm01.stdout:  libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-24T17:35:22.736 INFO:teuthology.orchestra.run.vm01.stdout:  libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc
2026-03-24T17:35:22.736 INFO:teuthology.orchestra.run.vm01.stdout:  python3-asyncssh python3-cachetools python3-ceph-common python3-cheroot
2026-03-24T17:35:22.736 INFO:teuthology.orchestra.run.vm01.stdout:  python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-24T17:35:22.736 INFO:teuthology.orchestra.run.vm01.stdout:  python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-24T17:35:22.736 INFO:teuthology.orchestra.run.vm01.stdout:  python3-joblib python3-kubernetes python3-natsort python3-portend
2026-03-24T17:35:22.736 INFO:teuthology.orchestra.run.vm01.stdout:  python3-prettytable python3-psutil python3-repoze.lru
2026-03-24T17:35:22.736 INFO:teuthology.orchestra.run.vm01.stdout:  python3-requests-oauthlib python3-routes python3-rsa python3-simplejson
2026-03-24T17:35:22.736 INFO:teuthology.orchestra.run.vm01.stdout:  python3-sklearn python3-sklearn-lib python3-tempora python3-threadpoolctl
2026-03-24T17:35:22.736 INFO:teuthology.orchestra.run.vm01.stdout:  python3-wcwidth python3-webob python3-websocket python3-zc.lockfile
2026-03-24T17:35:22.736 INFO:teuthology.orchestra.run.vm01.stdout:  sg3-utils sg3-utils-udev smartmontools socat xmlstarlet
2026-03-24T17:35:22.736 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T17:35:22.745 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED:
2026-03-24T17:35:22.745 INFO:teuthology.orchestra.run.vm01.stdout:  ceph-base* ceph-common* ceph-mon* ceph-osd* ceph-test* ceph-volume* radosgw*
2026-03-24T17:35:22.894 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 7 to remove and 44 not upgraded.
2026-03-24T17:35:22.894 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 732 MB disk space will be freed.
2026-03-24T17:35:22.930 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 124271 files and directories currently installed.)
2026-03-24T17:35:22.932 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-volume (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T17:35:22.993 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-osd (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T17:35:23.401 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-mon (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T17:35:23.802 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-base (20.2.0-712-g70f8415b-1jammy) ... 2026-03-24T17:35:24.216 INFO:teuthology.orchestra.run.vm01.stdout:Removing radosgw (20.2.0-712-g70f8415b-1jammy) ... 2026-03-24T17:35:24.616 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-test (20.2.0-712-g70f8415b-1jammy) ... 2026-03-24T17:35:24.640 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-common (20.2.0-712-g70f8415b-1jammy) ... 2026-03-24T17:35:25.049 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for man-db (2.10.2-1) ... 2026-03-24T17:35:25.082 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ... 2026-03-24T17:35:25.156 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 123780 files and directories currently installed.) 2026-03-24T17:35:25.158 INFO:teuthology.orchestra.run.vm01.stdout:Purging configuration files for radosgw (20.2.0-712-g70f8415b-1jammy) ... 2026-03-24T17:35:25.744 INFO:teuthology.orchestra.run.vm01.stdout:Purging configuration files for ceph-mon (20.2.0-712-g70f8415b-1jammy) ... 2026-03-24T17:35:26.145 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mon, directory '/var/lib/ceph/mon' not empty so not removed 2026-03-24T17:35:26.154 INFO:teuthology.orchestra.run.vm01.stdout:Purging configuration files for ceph-base (20.2.0-712-g70f8415b-1jammy) ... 
2026-03-24T17:35:26.558 INFO:teuthology.orchestra.run.vm01.stdout:Purging configuration files for ceph-common (20.2.0-712-g70f8415b-1jammy) ... 2026-03-24T17:35:27.086 INFO:teuthology.orchestra.run.vm01.stdout:Purging configuration files for ceph-osd (20.2.0-712-g70f8415b-1jammy) ... 2026-03-24T17:35:27.502 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-osd, directory '/var/lib/ceph/osd' not empty so not removed 2026-03-24T17:35:28.510 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-24T17:35:28.546 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists... 2026-03-24T17:35:28.714 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree... 2026-03-24T17:35:28.714 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information... 2026-03-24T17:35:28.832 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required: 2026-03-24T17:35:28.832 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-24T17:35:28.832 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1 2026-03-24T17:35:28.833 INFO:teuthology.orchestra.run.vm01.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc 2026-03-24T17:35:28.833 INFO:teuthology.orchestra.run.vm01.stdout: python3-asyncssh python3-cachetools python3-ceph-common python3-cheroot 2026-03-24T17:35:28.833 INFO:teuthology.orchestra.run.vm01.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-24T17:35:28.833 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-24T17:35:28.833 INFO:teuthology.orchestra.run.vm01.stdout: python3-joblib python3-kubernetes python3-natsort python3-portend 2026-03-24T17:35:28.833 
INFO:teuthology.orchestra.run.vm01.stdout: python3-prettytable python3-psutil python3-repoze.lru 2026-03-24T17:35:28.833 INFO:teuthology.orchestra.run.vm01.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplejson 2026-03-24T17:35:28.833 INFO:teuthology.orchestra.run.vm01.stdout: python3-sklearn python3-sklearn-lib python3-tempora python3-threadpoolctl 2026-03-24T17:35:28.833 INFO:teuthology.orchestra.run.vm01.stdout: python3-wcwidth python3-webob python3-websocket python3-zc.lockfile 2026-03-24T17:35:28.833 INFO:teuthology.orchestra.run.vm01.stdout: sg3-utils sg3-utils-udev smartmontools socat xmlstarlet 2026-03-24T17:35:28.833 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-24T17:35:28.841 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED: 2026-03-24T17:35:28.841 INFO:teuthology.orchestra.run.vm01.stdout: ceph-fuse* 2026-03-24T17:35:28.998 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 1 to remove and 44 not upgraded. 2026-03-24T17:35:28.998 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 2932 kB disk space will be freed. 2026-03-24T17:35:29.035 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 123764 files and directories currently installed.) 2026-03-24T17:35:29.037 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-fuse (20.2.0-712-g70f8415b-1jammy) ... 
2026-03-24T17:35:29.413 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for man-db (2.10.2-1) ... 2026-03-24T17:35:29.502 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 123755 files and directories currently installed.) 2026-03-24T17:35:29.504 INFO:teuthology.orchestra.run.vm01.stdout:Purging configuration files for ceph-fuse (20.2.0-712-g70f8415b-1jammy) ... 2026-03-24T17:35:30.864 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-24T17:35:30.900 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists... 2026-03-24T17:35:31.065 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree... 2026-03-24T17:35:31.065 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information... 
2026-03-24T17:35:31.178 INFO:teuthology.orchestra.run.vm01.stdout:Package 'ceph-test' is not installed, so not removed
2026-03-24T17:35:31.178 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T17:35:31.178 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-24T17:35:31.178 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-24T17:35:31.178 INFO:teuthology.orchestra.run.vm01.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc
2026-03-24T17:35:31.178 INFO:teuthology.orchestra.run.vm01.stdout: python3-asyncssh python3-cachetools python3-ceph-common python3-cheroot
2026-03-24T17:35:31.178 INFO:teuthology.orchestra.run.vm01.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-24T17:35:31.178 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-24T17:35:31.178 INFO:teuthology.orchestra.run.vm01.stdout: python3-joblib python3-kubernetes python3-natsort python3-portend
2026-03-24T17:35:31.178 INFO:teuthology.orchestra.run.vm01.stdout: python3-prettytable python3-psutil python3-repoze.lru
2026-03-24T17:35:31.178 INFO:teuthology.orchestra.run.vm01.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplejson
2026-03-24T17:35:31.178 INFO:teuthology.orchestra.run.vm01.stdout: python3-sklearn python3-sklearn-lib python3-tempora python3-threadpoolctl
2026-03-24T17:35:31.179 INFO:teuthology.orchestra.run.vm01.stdout: python3-wcwidth python3-webob python3-websocket python3-zc.lockfile
2026-03-24T17:35:31.179 INFO:teuthology.orchestra.run.vm01.stdout: sg3-utils sg3-utils-udev smartmontools socat xmlstarlet
2026-03-24T17:35:31.179 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T17:35:31.195 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 0 to remove and 44 not upgraded.
2026-03-24T17:35:31.195 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T17:35:31.229 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-24T17:35:31.404 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-24T17:35:31.405 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-24T17:35:31.516 INFO:teuthology.orchestra.run.vm01.stdout:Package 'ceph-volume' is not installed, so not removed
2026-03-24T17:35:31.516 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T17:35:31.516 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-24T17:35:31.516 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-24T17:35:31.516 INFO:teuthology.orchestra.run.vm01.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc
2026-03-24T17:35:31.516 INFO:teuthology.orchestra.run.vm01.stdout: python3-asyncssh python3-cachetools python3-ceph-common python3-cheroot
2026-03-24T17:35:31.516 INFO:teuthology.orchestra.run.vm01.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-24T17:35:31.516 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-24T17:35:31.516 INFO:teuthology.orchestra.run.vm01.stdout: python3-joblib python3-kubernetes python3-natsort python3-portend
2026-03-24T17:35:31.516 INFO:teuthology.orchestra.run.vm01.stdout: python3-prettytable python3-psutil python3-repoze.lru
2026-03-24T17:35:31.516 INFO:teuthology.orchestra.run.vm01.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplejson
2026-03-24T17:35:31.516 INFO:teuthology.orchestra.run.vm01.stdout: python3-sklearn python3-sklearn-lib python3-tempora python3-threadpoolctl
2026-03-24T17:35:31.516 INFO:teuthology.orchestra.run.vm01.stdout: python3-wcwidth python3-webob python3-websocket python3-zc.lockfile
2026-03-24T17:35:31.516 INFO:teuthology.orchestra.run.vm01.stdout: sg3-utils sg3-utils-udev smartmontools socat xmlstarlet
2026-03-24T17:35:31.516 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T17:35:31.532 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 0 to remove and 44 not upgraded.
2026-03-24T17:35:31.532 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T17:35:31.567 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-24T17:35:31.732 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-24T17:35:31.732 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-24T17:35:31.845 INFO:teuthology.orchestra.run.vm01.stdout:Package 'radosgw' is not installed, so not removed
2026-03-24T17:35:31.845 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T17:35:31.845 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-24T17:35:31.845 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-24T17:35:31.845 INFO:teuthology.orchestra.run.vm01.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc
2026-03-24T17:35:31.845 INFO:teuthology.orchestra.run.vm01.stdout: python3-asyncssh python3-cachetools python3-ceph-common python3-cheroot
2026-03-24T17:35:31.845 INFO:teuthology.orchestra.run.vm01.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-24T17:35:31.845 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-24T17:35:31.845 INFO:teuthology.orchestra.run.vm01.stdout: python3-joblib python3-kubernetes python3-natsort python3-portend
2026-03-24T17:35:31.845 INFO:teuthology.orchestra.run.vm01.stdout: python3-prettytable python3-psutil python3-repoze.lru
2026-03-24T17:35:31.845 INFO:teuthology.orchestra.run.vm01.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplejson
2026-03-24T17:35:31.846 INFO:teuthology.orchestra.run.vm01.stdout: python3-sklearn python3-sklearn-lib python3-tempora python3-threadpoolctl
2026-03-24T17:35:31.846 INFO:teuthology.orchestra.run.vm01.stdout: python3-wcwidth python3-webob python3-websocket python3-zc.lockfile
2026-03-24T17:35:31.846 INFO:teuthology.orchestra.run.vm01.stdout: sg3-utils sg3-utils-udev smartmontools socat xmlstarlet
2026-03-24T17:35:31.846 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T17:35:31.864 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 0 to remove and 44 not upgraded.
2026-03-24T17:35:31.864 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T17:35:31.898 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-24T17:35:32.071 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-24T17:35:32.071 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-24T17:35:32.187 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T17:35:32.187 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-24T17:35:32.187 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-24T17:35:32.187 INFO:teuthology.orchestra.run.vm01.stdout: librdkafka1 librgw2 libsgutils2-2 libsqlite3-mod-ceph nvme-cli
2026-03-24T17:35:32.187 INFO:teuthology.orchestra.run.vm01.stdout: python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-24T17:35:32.187 INFO:teuthology.orchestra.run.vm01.stdout: python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-24T17:35:32.187 INFO:teuthology.orchestra.run.vm01.stdout: python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-24T17:35:32.187 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-24T17:35:32.187 INFO:teuthology.orchestra.run.vm01.stdout: python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-24T17:35:32.187 INFO:teuthology.orchestra.run.vm01.stdout: python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-24T17:35:32.187 INFO:teuthology.orchestra.run.vm01.stdout: python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-24T17:35:32.187 INFO:teuthology.orchestra.run.vm01.stdout: python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-24T17:35:32.187 INFO:teuthology.orchestra.run.vm01.stdout: python3-websocket python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools
2026-03-24T17:35:32.187 INFO:teuthology.orchestra.run.vm01.stdout: socat xmlstarlet
2026-03-24T17:35:32.187 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T17:35:32.197 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED:
2026-03-24T17:35:32.197 INFO:teuthology.orchestra.run.vm01.stdout: python3-cephfs* python3-rados* python3-rgw*
2026-03-24T17:35:32.349 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 3 to remove and 44 not upgraded.
2026-03-24T17:35:32.349 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 2086 kB disk space will be freed.
2026-03-24T17:35:32.386 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... 123755 files and directories currently installed.)
2026-03-24T17:35:32.388 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-cephfs (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T17:35:32.399 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-rgw (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T17:35:32.409 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-rados (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T17:35:33.436 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T17:35:33.477 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-24T17:35:33.660 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-24T17:35:33.661 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-24T17:35:33.784 INFO:teuthology.orchestra.run.vm01.stdout:Package 'python3-rgw' is not installed, so not removed
2026-03-24T17:35:33.784 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T17:35:33.784 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-24T17:35:33.784 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-24T17:35:33.784 INFO:teuthology.orchestra.run.vm01.stdout: librdkafka1 librgw2 libsgutils2-2 libsqlite3-mod-ceph nvme-cli
2026-03-24T17:35:33.784 INFO:teuthology.orchestra.run.vm01.stdout: python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-24T17:35:33.784 INFO:teuthology.orchestra.run.vm01.stdout: python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-24T17:35:33.784 INFO:teuthology.orchestra.run.vm01.stdout: python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-24T17:35:33.784 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-24T17:35:33.784 INFO:teuthology.orchestra.run.vm01.stdout: python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-24T17:35:33.784 INFO:teuthology.orchestra.run.vm01.stdout: python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-24T17:35:33.784 INFO:teuthology.orchestra.run.vm01.stdout: python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-24T17:35:33.784 INFO:teuthology.orchestra.run.vm01.stdout: python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-24T17:35:33.784 INFO:teuthology.orchestra.run.vm01.stdout: python3-websocket python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools
2026-03-24T17:35:33.784 INFO:teuthology.orchestra.run.vm01.stdout: socat xmlstarlet
2026-03-24T17:35:33.784 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T17:35:33.801 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 0 to remove and 44 not upgraded.
2026-03-24T17:35:33.801 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T17:35:33.835 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-24T17:35:34.002 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-24T17:35:34.003 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-24T17:35:34.133 INFO:teuthology.orchestra.run.vm01.stdout:Package 'python3-cephfs' is not installed, so not removed
2026-03-24T17:35:34.133 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T17:35:34.133 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-24T17:35:34.133 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-24T17:35:34.134 INFO:teuthology.orchestra.run.vm01.stdout: librdkafka1 librgw2 libsgutils2-2 libsqlite3-mod-ceph nvme-cli
2026-03-24T17:35:34.134 INFO:teuthology.orchestra.run.vm01.stdout: python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-24T17:35:34.134 INFO:teuthology.orchestra.run.vm01.stdout: python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-24T17:35:34.134 INFO:teuthology.orchestra.run.vm01.stdout: python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-24T17:35:34.134 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-24T17:35:34.134 INFO:teuthology.orchestra.run.vm01.stdout: python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-24T17:35:34.134 INFO:teuthology.orchestra.run.vm01.stdout: python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-24T17:35:34.134 INFO:teuthology.orchestra.run.vm01.stdout: python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-24T17:35:34.134 INFO:teuthology.orchestra.run.vm01.stdout: python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-24T17:35:34.134 INFO:teuthology.orchestra.run.vm01.stdout: python3-websocket python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools
2026-03-24T17:35:34.134 INFO:teuthology.orchestra.run.vm01.stdout: socat xmlstarlet
2026-03-24T17:35:34.134 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T17:35:34.152 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 0 to remove and 44 not upgraded.
2026-03-24T17:35:34.152 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T17:35:34.188 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-24T17:35:34.402 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-24T17:35:34.403 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-24T17:35:34.551 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T17:35:34.551 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-24T17:35:34.551 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-24T17:35:34.552 INFO:teuthology.orchestra.run.vm01.stdout: librdkafka1 librgw2 libsgutils2-2 libsqlite3-mod-ceph nvme-cli
2026-03-24T17:35:34.552 INFO:teuthology.orchestra.run.vm01.stdout: python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-24T17:35:34.552 INFO:teuthology.orchestra.run.vm01.stdout: python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-24T17:35:34.552 INFO:teuthology.orchestra.run.vm01.stdout: python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-24T17:35:34.552 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-24T17:35:34.552 INFO:teuthology.orchestra.run.vm01.stdout: python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-24T17:35:34.552 INFO:teuthology.orchestra.run.vm01.stdout: python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-24T17:35:34.552 INFO:teuthology.orchestra.run.vm01.stdout: python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-24T17:35:34.552 INFO:teuthology.orchestra.run.vm01.stdout: python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-24T17:35:34.552 INFO:teuthology.orchestra.run.vm01.stdout: python3-websocket python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools
2026-03-24T17:35:34.552 INFO:teuthology.orchestra.run.vm01.stdout: socat xmlstarlet
2026-03-24T17:35:34.552 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T17:35:34.566 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED:
2026-03-24T17:35:34.566 INFO:teuthology.orchestra.run.vm01.stdout: python3-rbd*
2026-03-24T17:35:34.753 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 1 to remove and 44 not upgraded.
2026-03-24T17:35:34.753 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 1205 kB disk space will be freed.
2026-03-24T17:35:34.797 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... 123731 files and directories currently installed.)
2026-03-24T17:35:34.799 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-rbd (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T17:35:35.872 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T17:35:35.907 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-24T17:35:36.070 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-24T17:35:36.070 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-24T17:35:36.184 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T17:35:36.184 INFO:teuthology.orchestra.run.vm01.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-24T17:35:36.184 INFO:teuthology.orchestra.run.vm01.stdout: libboost-thread1.74.0 libcephfs-proxy2 libjq1 liboath0 libonig5 libpmemobj1
2026-03-24T17:35:36.184 INFO:teuthology.orchestra.run.vm01.stdout: libradosstriper1 librdkafka1 librgw2 libsgutils2-2 libsqlite3-mod-ceph
2026-03-24T17:35:36.184 INFO:teuthology.orchestra.run.vm01.stdout: nvme-cli python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-24T17:35:36.184 INFO:teuthology.orchestra.run.vm01.stdout: python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-24T17:35:36.184 INFO:teuthology.orchestra.run.vm01.stdout: python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-24T17:35:36.184 INFO:teuthology.orchestra.run.vm01.stdout: python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-24T17:35:36.184 INFO:teuthology.orchestra.run.vm01.stdout: python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-24T17:35:36.184 INFO:teuthology.orchestra.run.vm01.stdout: python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-24T17:35:36.184 INFO:teuthology.orchestra.run.vm01.stdout: python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-24T17:35:36.184 INFO:teuthology.orchestra.run.vm01.stdout: python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-24T17:35:36.184 INFO:teuthology.orchestra.run.vm01.stdout: python3-websocket python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools
2026-03-24T17:35:36.184 INFO:teuthology.orchestra.run.vm01.stdout: socat xmlstarlet
2026-03-24T17:35:36.184 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T17:35:36.193 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED:
2026-03-24T17:35:36.193 INFO:teuthology.orchestra.run.vm01.stdout: libcephfs-daemon* libcephfs-dev* libcephfs2*
2026-03-24T17:35:36.361 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 3 to remove and 44 not upgraded.
2026-03-24T17:35:36.361 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 2851 kB disk space will be freed.
2026-03-24T17:35:36.405 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... 123723 files and directories currently installed.)
2026-03-24T17:35:36.408 INFO:teuthology.orchestra.run.vm01.stdout:Removing libcephfs-daemon (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T17:35:36.420 INFO:teuthology.orchestra.run.vm01.stdout:Removing libcephfs-dev (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T17:35:36.431 INFO:teuthology.orchestra.run.vm01.stdout:Removing libcephfs2 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T17:35:36.458 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ...
2026-03-24T17:35:37.496 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T17:35:37.530 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-24T17:35:37.687 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-24T17:35:37.687 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-24T17:35:37.796 INFO:teuthology.orchestra.run.vm01.stdout:Package 'libcephfs-dev' is not installed, so not removed
2026-03-24T17:35:37.796 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T17:35:37.796 INFO:teuthology.orchestra.run.vm01.stdout:  ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-24T17:35:37.796 INFO:teuthology.orchestra.run.vm01.stdout:  libboost-thread1.74.0 libcephfs-proxy2 libjq1 liboath0 libonig5 libpmemobj1
2026-03-24T17:35:37.796 INFO:teuthology.orchestra.run.vm01.stdout:  libradosstriper1 librdkafka1 librgw2 libsgutils2-2 libsqlite3-mod-ceph
2026-03-24T17:35:37.796 INFO:teuthology.orchestra.run.vm01.stdout:  nvme-cli python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-24T17:35:37.796 INFO:teuthology.orchestra.run.vm01.stdout:  python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-24T17:35:37.796 INFO:teuthology.orchestra.run.vm01.stdout:  python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-24T17:35:37.796 INFO:teuthology.orchestra.run.vm01.stdout:  python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-24T17:35:37.796 INFO:teuthology.orchestra.run.vm01.stdout:  python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-24T17:35:37.796 INFO:teuthology.orchestra.run.vm01.stdout:  python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-24T17:35:37.796 INFO:teuthology.orchestra.run.vm01.stdout:  python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-24T17:35:37.796 INFO:teuthology.orchestra.run.vm01.stdout:  python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-24T17:35:37.796 INFO:teuthology.orchestra.run.vm01.stdout:  python3-websocket python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools
2026-03-24T17:35:37.796 INFO:teuthology.orchestra.run.vm01.stdout:  socat xmlstarlet
2026-03-24T17:35:37.796 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T17:35:37.812 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 0 to remove and 44 not upgraded.
2026-03-24T17:35:37.812 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T17:35:37.845 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-24T17:35:37.990 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-24T17:35:37.990 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-24T17:35:38.097 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T17:35:38.097 INFO:teuthology.orchestra.run.vm01.stdout:  ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-24T17:35:38.097 INFO:teuthology.orchestra.run.vm01.stdout:  libboost-thread1.74.0 libcephfs-proxy2 libdouble-conversion3 libfuse2
2026-03-24T17:35:38.097 INFO:teuthology.orchestra.run.vm01.stdout:  libgfapi0 libgfrpc0 libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1
2026-03-24T17:35:38.097 INFO:teuthology.orchestra.run.vm01.stdout:  libnbd0 liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5
2026-03-24T17:35:38.098 INFO:teuthology.orchestra.run.vm01.stdout:  libqt5network5 librdkafka1 libsgutils2-2 libthrift-0.16.0 nvme-cli
2026-03-24T17:35:38.098 INFO:teuthology.orchestra.run.vm01.stdout:  python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-24T17:35:38.098 INFO:teuthology.orchestra.run.vm01.stdout:  python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-24T17:35:38.098 INFO:teuthology.orchestra.run.vm01.stdout:  python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-24T17:35:38.098 INFO:teuthology.orchestra.run.vm01.stdout:  python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-24T17:35:38.098 INFO:teuthology.orchestra.run.vm01.stdout:  python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-24T17:35:38.098 INFO:teuthology.orchestra.run.vm01.stdout:  python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-24T17:35:38.098 INFO:teuthology.orchestra.run.vm01.stdout:  python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-24T17:35:38.098 INFO:teuthology.orchestra.run.vm01.stdout:  python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-24T17:35:38.098 INFO:teuthology.orchestra.run.vm01.stdout:  python3-websocket python3-zc.lockfile qttranslations5-l10n sg3-utils
2026-03-24T17:35:38.098 INFO:teuthology.orchestra.run.vm01.stdout:  sg3-utils-udev smartmontools socat xmlstarlet
2026-03-24T17:35:38.098 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T17:35:38.106 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED:
2026-03-24T17:35:38.106 INFO:teuthology.orchestra.run.vm01.stdout:  librados2* libradosstriper1* librbd1* librgw2* libsqlite3-mod-ceph*
2026-03-24T17:35:38.106 INFO:teuthology.orchestra.run.vm01.stdout:  qemu-block-extra* rbd-fuse*
2026-03-24T17:35:38.259 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 7 to remove and 44 not upgraded.
2026-03-24T17:35:38.259 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 59.2 MB disk space will be freed.
2026-03-24T17:35:38.296 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... 123701 files and directories currently installed.)
2026-03-24T17:35:38.298 INFO:teuthology.orchestra.run.vm01.stdout:Removing rbd-fuse (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T17:35:38.310 INFO:teuthology.orchestra.run.vm01.stdout:Removing libsqlite3-mod-ceph (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T17:35:38.322 INFO:teuthology.orchestra.run.vm01.stdout:Removing libradosstriper1 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T17:35:38.332 INFO:teuthology.orchestra.run.vm01.stdout:Removing qemu-block-extra (1:6.2+dfsg-2ubuntu6.28) ...
2026-03-24T17:35:38.748 INFO:teuthology.orchestra.run.vm01.stdout:Removing librbd1 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T17:35:38.761 INFO:teuthology.orchestra.run.vm01.stdout:Removing librgw2 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T17:35:38.795 INFO:teuthology.orchestra.run.vm01.stdout:Removing librados2 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T17:35:38.870 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-24T17:35:38.958 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ...
2026-03-24T17:35:39.040 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... 123650 files and directories currently installed.)
2026-03-24T17:35:39.043 INFO:teuthology.orchestra.run.vm01.stdout:Purging configuration files for qemu-block-extra (1:6.2+dfsg-2ubuntu6.28) ...
2026-03-24T17:35:40.408 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T17:35:40.442 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-24T17:35:40.583 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-24T17:35:40.584 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-24T17:35:40.690 INFO:teuthology.orchestra.run.vm01.stdout:Package 'librbd1' is not installed, so not removed
2026-03-24T17:35:40.690 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T17:35:40.690 INFO:teuthology.orchestra.run.vm01.stdout:  ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-24T17:35:40.690 INFO:teuthology.orchestra.run.vm01.stdout:  libboost-thread1.74.0 libcephfs-proxy2 libdouble-conversion3 libfuse2
2026-03-24T17:35:40.690 INFO:teuthology.orchestra.run.vm01.stdout:  libgfapi0 libgfrpc0 libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1
2026-03-24T17:35:40.690 INFO:teuthology.orchestra.run.vm01.stdout:  libnbd0 liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5
2026-03-24T17:35:40.691 INFO:teuthology.orchestra.run.vm01.stdout:  libqt5network5 librdkafka1 libsgutils2-2 libthrift-0.16.0 nvme-cli
2026-03-24T17:35:40.691 INFO:teuthology.orchestra.run.vm01.stdout:  python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-24T17:35:40.691 INFO:teuthology.orchestra.run.vm01.stdout:  python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-24T17:35:40.691 INFO:teuthology.orchestra.run.vm01.stdout:  python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-24T17:35:40.691 INFO:teuthology.orchestra.run.vm01.stdout:  python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-24T17:35:40.691 INFO:teuthology.orchestra.run.vm01.stdout:  python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-24T17:35:40.691 INFO:teuthology.orchestra.run.vm01.stdout:  python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-24T17:35:40.691 INFO:teuthology.orchestra.run.vm01.stdout:  python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-24T17:35:40.691 INFO:teuthology.orchestra.run.vm01.stdout:  python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-24T17:35:40.691 INFO:teuthology.orchestra.run.vm01.stdout:  python3-websocket python3-zc.lockfile qttranslations5-l10n sg3-utils
2026-03-24T17:35:40.691 INFO:teuthology.orchestra.run.vm01.stdout:  sg3-utils-udev smartmontools socat xmlstarlet
2026-03-24T17:35:40.691 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T17:35:40.706 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 0 to remove and 44 not upgraded.
2026-03-24T17:35:40.707 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T17:35:40.742 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-24T17:35:40.886 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-24T17:35:40.886 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-24T17:35:40.988 INFO:teuthology.orchestra.run.vm01.stdout:Package 'rbd-fuse' is not installed, so not removed
2026-03-24T17:35:40.988 INFO:teuthology.orchestra.run.vm01.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T17:35:40.988 INFO:teuthology.orchestra.run.vm01.stdout:  ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-24T17:35:40.988 INFO:teuthology.orchestra.run.vm01.stdout:  libboost-thread1.74.0 libcephfs-proxy2 libdouble-conversion3 libfuse2
2026-03-24T17:35:40.988 INFO:teuthology.orchestra.run.vm01.stdout:  libgfapi0 libgfrpc0 libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1
2026-03-24T17:35:40.988 INFO:teuthology.orchestra.run.vm01.stdout:  libnbd0 liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5
2026-03-24T17:35:40.989 INFO:teuthology.orchestra.run.vm01.stdout:  libqt5network5 librdkafka1 libsgutils2-2 libthrift-0.16.0 nvme-cli
2026-03-24T17:35:40.989 INFO:teuthology.orchestra.run.vm01.stdout:  python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-24T17:35:40.989 INFO:teuthology.orchestra.run.vm01.stdout:  python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-24T17:35:40.989 INFO:teuthology.orchestra.run.vm01.stdout:  python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-24T17:35:40.989 INFO:teuthology.orchestra.run.vm01.stdout:  python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-24T17:35:40.989 INFO:teuthology.orchestra.run.vm01.stdout:  python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-24T17:35:40.989 INFO:teuthology.orchestra.run.vm01.stdout:  python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-24T17:35:40.989 INFO:teuthology.orchestra.run.vm01.stdout:  python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-24T17:35:40.989 INFO:teuthology.orchestra.run.vm01.stdout:  python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-24T17:35:40.989 INFO:teuthology.orchestra.run.vm01.stdout:  python3-websocket python3-zc.lockfile qttranslations5-l10n sg3-utils
2026-03-24T17:35:40.989 INFO:teuthology.orchestra.run.vm01.stdout:  sg3-utils-udev smartmontools socat xmlstarlet
2026-03-24T17:35:40.989 INFO:teuthology.orchestra.run.vm01.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T17:35:41.004 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 0 to remove and 44 not upgraded.
2026-03-24T17:35:41.004 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T17:35:41.006 DEBUG:teuthology.orchestra.run.vm01:> dpkg -l | grep '^.\(U\|H\)R' | awk '{print $2}' | sudo xargs --no-run-if-empty dpkg -P --force-remove-reinstreq
2026-03-24T17:35:41.062 DEBUG:teuthology.orchestra.run.vm01:> sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" autoremove
2026-03-24T17:35:41.139 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-24T17:35:41.298 INFO:teuthology.orchestra.run.vm01.stdout:Building dependency tree...
2026-03-24T17:35:41.299 INFO:teuthology.orchestra.run.vm01.stdout:Reading state information...
2026-03-24T17:35:41.422 INFO:teuthology.orchestra.run.vm01.stdout:The following packages will be REMOVED:
2026-03-24T17:35:41.422 INFO:teuthology.orchestra.run.vm01.stdout:  ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-24T17:35:41.422 INFO:teuthology.orchestra.run.vm01.stdout:  libboost-thread1.74.0 libcephfs-proxy2 libdouble-conversion3 libfuse2
2026-03-24T17:35:41.422 INFO:teuthology.orchestra.run.vm01.stdout:  libgfapi0 libgfrpc0 libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1
2026-03-24T17:35:41.422 INFO:teuthology.orchestra.run.vm01.stdout:  libnbd0 liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5
2026-03-24T17:35:41.423 INFO:teuthology.orchestra.run.vm01.stdout:  libqt5network5 librdkafka1 libsgutils2-2 libthrift-0.16.0 nvme-cli
2026-03-24T17:35:41.423 INFO:teuthology.orchestra.run.vm01.stdout:  python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-24T17:35:41.423 INFO:teuthology.orchestra.run.vm01.stdout:  python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-24T17:35:41.423 INFO:teuthology.orchestra.run.vm01.stdout:  python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-24T17:35:41.423 INFO:teuthology.orchestra.run.vm01.stdout:  python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-24T17:35:41.423 INFO:teuthology.orchestra.run.vm01.stdout:  python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-24T17:35:41.423 INFO:teuthology.orchestra.run.vm01.stdout:  python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-24T17:35:41.423 INFO:teuthology.orchestra.run.vm01.stdout:  python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-24T17:35:41.423 INFO:teuthology.orchestra.run.vm01.stdout:  python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-24T17:35:41.423 INFO:teuthology.orchestra.run.vm01.stdout:  python3-websocket python3-zc.lockfile qttranslations5-l10n sg3-utils
2026-03-24T17:35:41.423 INFO:teuthology.orchestra.run.vm01.stdout:  sg3-utils-udev smartmontools socat xmlstarlet
2026-03-24T17:35:41.568 INFO:teuthology.orchestra.run.vm01.stdout:0 upgraded, 0 newly installed, 64 to remove and 44 not upgraded.
2026-03-24T17:35:41.568 INFO:teuthology.orchestra.run.vm01.stdout:After this operation, 96.8 MB disk space will be freed.
2026-03-24T17:35:41.605 INFO:teuthology.orchestra.run.vm01.stdout:(Reading database ... 123650 files and directories currently installed.)
2026-03-24T17:35:41.607 INFO:teuthology.orchestra.run.vm01.stdout:Removing ceph-mgr-modules-core (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T17:35:41.621 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/volumes/fs/operations/versions' not empty so not removed
2026-03-24T17:35:41.621 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/test_orchestrator' not empty so not removed
2026-03-24T17:35:41.621 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/telemetry' not empty so not removed
2026-03-24T17:35:41.621 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/telegraf' not empty so not removed
2026-03-24T17:35:41.621 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/status' not empty so not removed
2026-03-24T17:35:41.621 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/stats/fs' not empty so not removed
2026-03-24T17:35:41.621 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/snap_schedule/fs' not empty so not removed
2026-03-24T17:35:41.621 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/selftest' not empty so not removed
2026-03-24T17:35:41.621 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/rgw' not empty so not removed
2026-03-24T17:35:41.621 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/rbd_support' not empty so not removed
2026-03-24T17:35:41.621 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/prometheus' not empty so not removed
2026-03-24T17:35:41.621 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/progress' not empty so not removed
2026-03-24T17:35:41.621 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/pg_autoscaler' not empty so not removed
2026-03-24T17:35:41.621 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/osd_support' not empty so not removed
2026-03-24T17:35:41.621 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/osd_perf_query' not empty so not removed
2026-03-24T17:35:41.622 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/orchestrator' not empty so not removed
2026-03-24T17:35:41.622 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/nfs' not empty so not removed
2026-03-24T17:35:41.622 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/mirroring/fs/dir_map' not empty so not removed
2026-03-24T17:35:41.622 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/localpool' not empty so not removed
2026-03-24T17:35:41.622 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/iostat' not empty so not removed
2026-03-24T17:35:41.622 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/insights' not empty so not removed
2026-03-24T17:35:41.622 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/influx' not empty so not removed
2026-03-24T17:35:41.622 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/devicehealth' not empty so not removed
2026-03-24T17:35:41.622 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/crash' not empty so not removed
2026-03-24T17:35:41.622 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/balancer' not empty so not removed
2026-03-24T17:35:41.622 INFO:teuthology.orchestra.run.vm01.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/alerts' not empty so not removed
2026-03-24T17:35:41.637 INFO:teuthology.orchestra.run.vm01.stdout:Removing jq (1.6-2.1ubuntu3.1) ...
2026-03-24T17:35:41.647 INFO:teuthology.orchestra.run.vm01.stdout:Removing kpartx (0.8.8-1ubuntu1.22.04.4) ...
2026-03-24T17:35:41.657 INFO:teuthology.orchestra.run.vm01.stdout:Removing libboost-iostreams1.74.0:amd64 (1.74.0-14ubuntu3) ...
2026-03-24T17:35:41.669 INFO:teuthology.orchestra.run.vm01.stdout:Removing libboost-thread1.74.0:amd64 (1.74.0-14ubuntu3) ...
2026-03-24T17:35:41.678 INFO:teuthology.orchestra.run.vm01.stdout:Removing libcephfs-proxy2 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T17:35:41.688 INFO:teuthology.orchestra.run.vm01.stdout:Removing libthrift-0.16.0:amd64 (0.16.0-2) ...
2026-03-24T17:35:41.698 INFO:teuthology.orchestra.run.vm01.stdout:Removing libqt5network5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-24T17:35:41.708 INFO:teuthology.orchestra.run.vm01.stdout:Removing libqt5dbus5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-24T17:35:41.718 INFO:teuthology.orchestra.run.vm01.stdout:Removing libqt5core5a:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-24T17:35:41.739 INFO:teuthology.orchestra.run.vm01.stdout:Removing libdouble-conversion3:amd64 (3.1.7-4) ...
2026-03-24T17:35:41.751 INFO:teuthology.orchestra.run.vm01.stdout:Removing libfuse2:amd64 (2.9.9-5ubuntu3) ...
2026-03-24T17:35:41.761 INFO:teuthology.orchestra.run.vm01.stdout:Removing libgfapi0:amd64 (10.1-1ubuntu0.2) ...
2026-03-24T17:35:41.772 INFO:teuthology.orchestra.run.vm01.stdout:Removing libgfrpc0:amd64 (10.1-1ubuntu0.2) ...
2026-03-24T17:35:41.781 INFO:teuthology.orchestra.run.vm01.stdout:Removing libgfxdr0:amd64 (10.1-1ubuntu0.2) ...
2026-03-24T17:35:41.791 INFO:teuthology.orchestra.run.vm01.stdout:Removing libglusterfs0:amd64 (10.1-1ubuntu0.2) ...
2026-03-24T17:35:41.800 INFO:teuthology.orchestra.run.vm01.stdout:Removing libiscsi7:amd64 (1.19.0-3build2) ...
2026-03-24T17:35:41.810 INFO:teuthology.orchestra.run.vm01.stdout:Removing libjq1:amd64 (1.6-2.1ubuntu3.1) ...
2026-03-24T17:35:41.820 INFO:teuthology.orchestra.run.vm01.stdout:Removing liblttng-ust1:amd64 (2.13.1-1ubuntu1) ...
2026-03-24T17:35:41.830 INFO:teuthology.orchestra.run.vm01.stdout:Removing libnbd0 (1.10.5-1) ...
2026-03-24T17:35:41.839 INFO:teuthology.orchestra.run.vm01.stdout:Removing liboath0:amd64 (2.6.7-3ubuntu0.1) ...
2026-03-24T17:35:41.849 INFO:teuthology.orchestra.run.vm01.stdout:Removing libonig5:amd64 (6.9.7.1-2build1) ...
2026-03-24T17:35:41.858 INFO:teuthology.orchestra.run.vm01.stdout:Removing libpcre2-16-0:amd64 (10.39-3ubuntu0.1) ...
2026-03-24T17:35:41.868 INFO:teuthology.orchestra.run.vm01.stdout:Removing libpmemobj1:amd64 (1.11.1-3build1) ...
2026-03-24T17:35:41.878 INFO:teuthology.orchestra.run.vm01.stdout:Removing librdkafka1:amd64 (1.8.0-1build1) ...
2026-03-24T17:35:41.887 INFO:teuthology.orchestra.run.vm01.stdout:Removing sg3-utils-udev (1.46-1ubuntu0.22.04.1) ...
2026-03-24T17:35:41.896 INFO:teuthology.orchestra.run.vm01.stdout:update-initramfs: deferring update (trigger activated)
2026-03-24T17:35:41.905 INFO:teuthology.orchestra.run.vm01.stdout:Removing sg3-utils (1.46-1ubuntu0.22.04.1) ...
2026-03-24T17:35:41.920 INFO:teuthology.orchestra.run.vm01.stdout:Removing libsgutils2-2:amd64 (1.46-1ubuntu0.22.04.1) ...
2026-03-24T17:35:41.931 INFO:teuthology.orchestra.run.vm01.stdout:Removing nvme-cli (1.16-3ubuntu0.3) ...
2026-03-24T17:35:42.297 INFO:teuthology.orchestra.run.vm01.stdout:Removing python-asyncssh-doc (2.5.0-1ubuntu0.1) ...
2026-03-24T17:35:42.309 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-asyncssh (2.5.0-1ubuntu0.1) ...
2026-03-24T17:35:42.364 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-kubernetes (12.0.1-1ubuntu1) ...
2026-03-24T17:35:42.620 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-google-auth (1.5.1-3) ...
2026-03-24T17:35:42.671 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-cachetools (5.0.0-1) ...
2026-03-24T17:35:42.717 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-ceph-argparse (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T17:35:42.769 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-ceph-common (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T17:35:42.825 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-cherrypy3 (18.6.1-4) ...
2026-03-24T17:35:42.899 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-cheroot (8.5.2+ds1-1ubuntu3.1) ...
2026-03-24T17:35:42.950 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-jaraco.collections (3.4.0-2) ...
2026-03-24T17:35:42.997 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-jaraco.classes (3.2.1-3) ...
2026-03-24T17:35:43.045 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-portend (3.0.0-1) ...
2026-03-24T17:35:43.092 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-tempora (4.1.2-1) ...
2026-03-24T17:35:43.139 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-jaraco.text (3.6.0-2) ...
2026-03-24T17:35:43.186 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-jaraco.functools (3.4.0-2) ...
2026-03-24T17:35:43.232 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-sklearn (0.23.2-5ubuntu6) ...
2026-03-24T17:35:43.353 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-joblib (0.17.0-4ubuntu1) ...
2026-03-24T17:35:43.411 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-natsort (8.0.2-1) ...
2026-03-24T17:35:43.458 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-prettytable (2.5.0-2) ...
2026-03-24T17:35:43.504 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-psutil (5.9.0-1build1) ...
2026-03-24T17:35:43.554 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-routes (2.5.1-1ubuntu1) ...
2026-03-24T17:35:43.603 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-repoze.lru (0.7-2) ...
2026-03-24T17:35:43.651 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-requests-oauthlib (1.3.0+ds-0.1) ...
2026-03-24T17:35:43.701 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-rsa (4.8-1) ...
2026-03-24T17:35:43.751 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-simplejson (3.17.6-1build1) ...
2026-03-24T17:35:43.803 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-sklearn-lib:amd64 (0.23.2-5ubuntu6) ...
2026-03-24T17:35:43.816 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-threadpoolctl (3.1.0-1) ...
2026-03-24T17:35:43.864 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-wcwidth (0.2.5+dfsg1-1) ...
2026-03-24T17:35:43.911 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-webob (1:1.8.6-1.1ubuntu0.1) ...
2026-03-24T17:35:43.961 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-websocket (1.2.3-1) ...
2026-03-24T17:35:44.012 INFO:teuthology.orchestra.run.vm01.stdout:Removing python3-zc.lockfile (2.0-1) ...
2026-03-24T17:35:44.062 INFO:teuthology.orchestra.run.vm01.stdout:Removing qttranslations5-l10n (5.15.3-1) ...
2026-03-24T17:35:44.080 INFO:teuthology.orchestra.run.vm01.stdout:Removing smartmontools (7.2-1ubuntu0.1) ...
2026-03-24T17:35:44.501 INFO:teuthology.orchestra.run.vm01.stdout:Removing socat (1.7.4.1-3ubuntu4) ...
2026-03-24T17:35:44.513 INFO:teuthology.orchestra.run.vm01.stdout:Removing xmlstarlet (1.6.1-2.1) ...
2026-03-24T17:35:44.548 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ...
2026-03-24T17:35:44.559 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-24T17:35:44.603 INFO:teuthology.orchestra.run.vm01.stdout:Processing triggers for initramfs-tools (0.140ubuntu13.5) ...
2026-03-24T17:35:44.619 INFO:teuthology.orchestra.run.vm01.stdout:update-initramfs: Generating /boot/initrd.img-5.15.0-171-generic
2026-03-24T17:35:49.286 INFO:teuthology.orchestra.run.vm01.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T17:35:49.289 DEBUG:teuthology.parallel:result is None
2026-03-24T17:35:49.289 INFO:teuthology.task.install:Removing ceph sources lists on ubuntu@vm01.local
2026-03-24T17:35:49.289 DEBUG:teuthology.orchestra.run.vm01:> sudo rm -f /etc/apt/sources.list.d/ceph.list
2026-03-24T17:35:49.337 DEBUG:teuthology.orchestra.run.vm01:> sudo apt-get update
2026-03-24T17:35:49.439 INFO:teuthology.orchestra.run.vm01.stdout:Hit:1 http://archive.ubuntu.com/ubuntu jammy InRelease
2026-03-24T17:35:49.442 INFO:teuthology.orchestra.run.vm01.stdout:Hit:2 http://archive.ubuntu.com/ubuntu jammy-updates InRelease
2026-03-24T17:35:49.449 INFO:teuthology.orchestra.run.vm01.stdout:Hit:3 http://archive.ubuntu.com/ubuntu jammy-backports InRelease
2026-03-24T17:35:49.617 INFO:teuthology.orchestra.run.vm01.stdout:Hit:4 http://security.ubuntu.com/ubuntu jammy-security InRelease
2026-03-24T17:35:50.461 INFO:teuthology.orchestra.run.vm01.stdout:Reading package lists...
2026-03-24T17:35:50.474 DEBUG:teuthology.parallel:result is None
2026-03-24T17:35:50.474 DEBUG:teuthology.run_tasks:Unwinding manager clock
2026-03-24T17:35:50.476 INFO:teuthology.task.clock:Checking final clock skew...
2026-03-24T17:35:50.476 DEBUG:teuthology.orchestra.run.vm01:> PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-24T17:35:50.834 INFO:teuthology.orchestra.run.vm01.stdout:     remote           refid      st t when poll reach   delay   offset  jitter
2026-03-24T17:35:50.834 INFO:teuthology.orchestra.run.vm01.stdout:==============================================================================
2026-03-24T17:35:50.834 INFO:teuthology.orchestra.run.vm01.stdout: 0.ubuntu.pool.n .POOL.          16 p    -   64    0    0.000   +0.000   0.000
2026-03-24T17:35:50.834 INFO:teuthology.orchestra.run.vm01.stdout: 1.ubuntu.pool.n .POOL.          16 p    -   64    0    0.000   +0.000   0.000
2026-03-24T17:35:50.834 INFO:teuthology.orchestra.run.vm01.stdout: 2.ubuntu.pool.n .POOL.          16 p    -   64    0    0.000   +0.000   0.000
2026-03-24T17:35:50.834 INFO:teuthology.orchestra.run.vm01.stdout: 3.ubuntu.pool.n .POOL.          16 p    -   64    0    0.000   +0.000   0.000
2026-03-24T17:35:50.834 INFO:teuthology.orchestra.run.vm01.stdout: ntp.ubuntu.com  .POOL.          16 p    -   64    0    0.000   +0.000   0.000
2026-03-24T17:35:50.834 INFO:teuthology.orchestra.run.vm01.stdout:-server1a.meinbe 124.216.164.14   2 u   34  128  377   25.062   +0.074   0.035
2026-03-24T17:35:50.834 INFO:teuthology.orchestra.run.vm01.stdout:-trick.infra.9rc 178.215.228.24   3 u  235  256  377   28.352   -4.116   0.174
2026-03-24T17:35:50.834 INFO:teuthology.orchestra.run.vm01.stdout:+de.relay.mahi.b 109.190.177.205  2 u   27  128  377   21.168   -0.044   0.053
2026-03-24T17:35:50.834 INFO:teuthology.orchestra.run.vm01.stdout:+vps-fra2.orlean 169.254.169.254  4 u   30  128  377   20.954   +0.096  17.487
2026-03-24T17:35:50.834 INFO:teuthology.orchestra.run.vm01.stdout:*mail.gunnarhofm 192.53.103.108   2 u   22  128  377   25.023   +0.015   0.102
2026-03-24T17:35:50.834 DEBUG:teuthology.run_tasks:Unwinding manager ansible.cephlab
2026-03-24T17:35:50.836 INFO:teuthology.task.ansible:Skipping ansible cleanup...
2026-03-24T17:35:50.837 DEBUG:teuthology.run_tasks:Unwinding manager selinux
2026-03-24T17:35:50.839 DEBUG:teuthology.run_tasks:Unwinding manager pcp
2026-03-24T17:35:50.841 DEBUG:teuthology.run_tasks:Unwinding manager internal.timer
2026-03-24T17:35:50.843 INFO:teuthology.task.internal:Duration was 2786.254990 seconds
2026-03-24T17:35:50.843 DEBUG:teuthology.run_tasks:Unwinding manager internal.syslog
2026-03-24T17:35:50.845 INFO:teuthology.task.internal.syslog:Shutting down syslog monitoring...
2026-03-24T17:35:50.845 DEBUG:teuthology.orchestra.run.vm01:> sudo rm -f -- /etc/rsyslog.d/80-cephtest.conf && sudo service rsyslog restart
2026-03-24T17:35:50.867 INFO:teuthology.task.internal.syslog:Checking logs for errors...
2026-03-24T17:35:50.867 DEBUG:teuthology.task.internal.syslog:Checking ubuntu@vm01.local
2026-03-24T17:35:50.867 DEBUG:teuthology.orchestra.run.vm01:> grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' /home/ubuntu/cephtest/archive/syslog/kern.log | grep -v 'task .* blocked for more than .* seconds' | grep -v 'lockdep is turned off' | grep -v 'trying to register non-static key' | grep -v 'DEBUG: fsize' | grep -v CRON | grep -v 'BUG: bad unlock balance detected' | grep -v 'inconsistent lock state' | grep -v '*** DEADLOCK ***' | grep -v 'INFO: possible irq lock inversion dependency detected' | grep -v 'INFO: NMI handler (perf_event_nmi_handler) took too long to run' | grep -v 'INFO: recovery required on readonly' | grep -v 'ceph-create-keys: INFO' | grep -v INFO:ceph-create-keys | grep -v 'Loaded datasource DataSourceOpenStack' | grep -v 'container-storage-setup: INFO: Volume group backing root filesystem could not be determined' | grep -E -v '\bsalt-master\b|\bsalt-minion\b|\bsalt-api\b' | grep -v ceph-crash | grep -E -v '\btcmu-runner\b.*\bINFO\b' | head -n 1
2026-03-24T17:35:50.917 INFO:teuthology.task.internal.syslog:Gathering journactl...
2026-03-24T17:35:50.917 DEBUG:teuthology.orchestra.run.vm01:> sudo journalctl > /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-24T17:35:50.981 INFO:teuthology.task.internal.syslog:Compressing syslogs...
2026-03-24T17:35:50.981 DEBUG:teuthology.orchestra.run.vm01:> find /home/ubuntu/cephtest/archive/syslog -name '*.log' -print0 | sudo xargs -0 --max-args=1 --max-procs=0 --verbose --no-run-if-empty -- gzip -5 --verbose --
2026-03-24T17:35:51.033 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-24T17:35:51.033 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-24T17:35:51.033 INFO:teuthology.orchestra.run.vm01.stderr:/home/ubuntu/cephtest/archive/syslog/kern.log:
2026-03-24T17:35:51.033 INFO:teuthology.orchestra.run.vm01.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-24T17:35:51.033 INFO:teuthology.orchestra.run.vm01.stderr: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/kern.log.gz
2026-03-24T17:35:51.033 INFO:teuthology.orchestra.run.vm01.stderr:/home/ubuntu/cephtest/archive/syslog/misc.log:  0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/misc.log.gz
2026-03-24T17:35:51.037 INFO:teuthology.orchestra.run.vm01.stderr:/home/ubuntu/cephtest/archive/syslog/journalctl.log: 85.8% -- replaced with /home/ubuntu/cephtest/archive/syslog/journalctl.log.gz
2026-03-24T17:35:51.038 DEBUG:teuthology.run_tasks:Unwinding manager internal.sudo
2026-03-24T17:35:51.040 INFO:teuthology.task.internal:Restoring /etc/sudoers...
2026-03-24T17:35:51.040 DEBUG:teuthology.orchestra.run.vm01:> sudo mv -f /etc/sudoers.orig.teuthology /etc/sudoers
2026-03-24T17:35:51.088 DEBUG:teuthology.run_tasks:Unwinding manager internal.coredump
2026-03-24T17:35:51.091 DEBUG:teuthology.orchestra.run.vm01:> sudo sysctl -w kernel.core_pattern=core && sudo bash -c 'for f in `find /home/ubuntu/cephtest/archive/coredump -type f`; do file $f | grep -q systemd-sysusers && rm $f || true ; done' && rmdir --ignore-fail-on-non-empty -- /home/ubuntu/cephtest/archive/coredump
2026-03-24T17:35:51.137 INFO:teuthology.orchestra.run.vm01.stdout:kernel.core_pattern = core
2026-03-24T17:35:51.145 DEBUG:teuthology.orchestra.run.vm01:> test -e /home/ubuntu/cephtest/archive/coredump
2026-03-24T17:35:51.188 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-24T17:35:51.188 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive
2026-03-24T17:35:51.190 INFO:teuthology.task.internal:Transferring archived files...
2026-03-24T17:35:51.190 DEBUG:teuthology.misc:Transferring archived files from vm01:/home/ubuntu/cephtest/archive to /archive/kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps/3621/remote/vm01
2026-03-24T17:35:51.190 DEBUG:teuthology.orchestra.run.vm01:> sudo tar c -f - -C /home/ubuntu/cephtest/archive -- .
2026-03-24T17:35:51.237 INFO:teuthology.task.internal:Removing archive directory...
2026-03-24T17:35:51.237 DEBUG:teuthology.orchestra.run.vm01:> rm -rf -- /home/ubuntu/cephtest/archive
2026-03-24T17:35:51.284 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive_upload
2026-03-24T17:35:51.287 INFO:teuthology.task.internal:Not uploading archives.
2026-03-24T17:35:51.287 DEBUG:teuthology.run_tasks:Unwinding manager internal.base
2026-03-24T17:35:51.289 INFO:teuthology.task.internal:Tidying up after the test...
2026-03-24T17:35:51.289 DEBUG:teuthology.orchestra.run.vm01:> find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest
2026-03-24T17:35:51.328 INFO:teuthology.orchestra.run.vm01.stdout:   258077      4 drwxr-xr-x   2 ubuntu   ubuntu       4096 Mar 24 17:35 /home/ubuntu/cephtest
2026-03-24T17:35:51.329 DEBUG:teuthology.run_tasks:Unwinding manager console_log
2026-03-24T17:35:51.335 INFO:teuthology.run:Summary data:
description: rbd/cli_v1/{base/install clusters/{fixed-1} conf/{disable-pool-app} features/format-1
  msgr-failures/few objectstore/bluestore-comp-zlib supported-random-distro$/{ubuntu_latest}
  workloads/rbd_cli_generic}
duration: 2786.254989862442
flavor: default
owner: kyr
success: true

2026-03-24T17:35:51.335 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-24T17:35:51.356 INFO:teuthology.run:pass