2026-03-24T10:45:05.757 INFO:root:teuthology version: 1.2.4.dev6+g1c580df7a
2026-03-24T10:45:05.761 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-24T10:45:05.780 INFO:teuthology.run:Config:
archive_path: /archive/kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps/3589
branch: tentacle
description: rbd/cli_v1/{base/install clusters/{fixed-1} conf/{disable-pool-app} features/format-1 msgr-failures/few objectstore/bluestore-comp-lz4 supported-random-distro$/{ubuntu_latest} workloads/rbd_cli_generic}
email: null
first_in_suite: false
flavor: default
job_id: '3589'
ktype: distro
last_in_suite: false
machine_type: vps
name: kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps
no_nested_subset: false
os_type: ubuntu
os_version: '22.04'
overrides:
  admin_socket:
    branch: tentacle
  ansible.cephlab:
    branch: main
    repo: https://github.com/kshtsk/ceph-cm-ansible.git
    skip_tags: nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
    vars:
      logical_volumes:
        lv_1:
          scratch_dev: true
          size: 25%VG
          vg: vg_nvme
        lv_2:
          scratch_dev: true
          size: 25%VG
          vg: vg_nvme
        lv_3:
          scratch_dev: true
          size: 25%VG
          vg: vg_nvme
        lv_4:
          scratch_dev: true
          size: 25%VG
          vg: vg_nvme
      timezone: UTC
      volume_groups:
        vg_nvme:
          pvs: /dev/vdb,/dev/vdc,/dev/vdd,/dev/vde
  ceph:
    conf:
      client:
        rbd default format: 1
      global:
        mon client directed command retry: 5
        mon warn on pool no app: false
        ms inject socket failures: 5000
      mgr:
        debug mgr: 20
        debug ms: 1
      mon:
        debug mon: 20
        debug ms: 1
        debug paxos: 20
      osd:
        bluestore block size: 96636764160
        bluestore compression algorithm: lz4
        bluestore compression mode: aggressive
        bluestore fsck on mount: true
        debug bluefs: 1/20
        debug bluestore: 1/20
        debug ms: 1
        debug osd: 20
        debug rocksdb: 4/10
        mon osd backfillfull_ratio: 0.85
        mon osd full ratio: 0.9
        mon osd nearfull ratio: 0.8
        osd failsafe full ratio: 0.95
        osd mclock iops capacity threshold hdd: 49000
        osd objectstore: bluestore
        osd shutdown pgref assert: true
    flavor: default
    fs: xfs
    log-ignorelist:
    - \(MDS_ALL_DOWN\)
    - \(MDS_UP_LESS_THAN_MAX\)
    - \(OSD_SLOW_PING_TIME
    sha1: 70f8415b300f041766fa27faf7d5472699e32388
  ceph-deploy:
    conf:
      client:
        log file: /var/log/ceph/ceph-$name.$pid.log
      global:
        osd crush chooseleaf type: 0
        osd pool default pg num: 128
        osd pool default pgp num: 128
        osd pool default size: 2
      mon: {}
  cephadm:
    cephadm_binary_url: https://download.ceph.com/rpm-20.2.0/el9/noarch/cephadm
  install:
    ceph:
      flavor: default
      sha1: 70f8415b300f041766fa27faf7d5472699e32388
    extra_system_packages:
      deb:
      - python3-jmespath
      - python3-xmltodict
      - s3cmd
      rpm:
      - bzip2
      - perl-Test-Harness
      - python3-jmespath
      - python3-xmltodict
      - s3cmd
  thrashosds:
    bdev_inject_crash: 2
    bdev_inject_crash_probability: 0.5
  workunit:
    branch: tt-tentacle
    sha1: 0392f78529848ec72469e8e431875cb98d3a5fb4
owner: kyr
priority: 1000
repo: https://github.com/ceph/ceph.git
roles:
- - mon.a
  - mgr.x
  - osd.0
  - osd.1
  - osd.2
  - client.0
seed: 3051
sha1: 70f8415b300f041766fa27faf7d5472699e32388
sleep_before_teardown: 0
subset: 1/128
suite: rbd
suite_branch: tt-tentacle
suite_path: /home/teuthos/src/github.com_kshtsk_ceph_0392f78529848ec72469e8e431875cb98d3a5fb4/qa
suite_relpath: qa
suite_repo: https://github.com/kshtsk/ceph.git
suite_sha1: 0392f78529848ec72469e8e431875cb98d3a5fb4
targets:
  vm05.local: ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAVJJjIoDa2cTjdkjKgs/H+4qhgK0vF9IZI84LLvKo+3+ZBDA+acNPK9sl3SYRY6paO1CwgelHH/nU7wFhY4+34=
tasks:
- install: null
- ceph: null
- workunit:
    clients:
      client.0:
      - rbd/cli_generic.sh
teuthology:
  fragments_dropped: []
  meta: {}
  postmerge: []
teuthology_branch: clyso-debian-13
teuthology_repo: https://github.com/clyso/teuthology
teuthology_sha1: 1c580df7a9c7c2aadc272da296344fd99f27c444
timestamp: 2026-03-20_22:04:26
tube: vps
user: kyr
verbose: false
worker_log: /home/teuthos/.teuthology/dispatcher/dispatcher.vps.2366871
2026-03-24T10:45:05.780 INFO:teuthology.run:suite_path is set to /home/teuthos/src/github.com_kshtsk_ceph_0392f78529848ec72469e8e431875cb98d3a5fb4/qa; will attempt to use it
2026-03-24T10:45:05.780 INFO:teuthology.run:Found tasks at /home/teuthos/src/github.com_kshtsk_ceph_0392f78529848ec72469e8e431875cb98d3a5fb4/qa/tasks
2026-03-24T10:45:05.780 INFO:teuthology.run_tasks:Running task internal.check_packages...
2026-03-24T10:45:05.781 INFO:teuthology.task.internal:Checking packages...
2026-03-24T10:45:05.781 INFO:teuthology.task.internal:Checking packages for os_type 'ubuntu', flavor 'default' and ceph hash '70f8415b300f041766fa27faf7d5472699e32388'
2026-03-24T10:45:05.781 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch
2026-03-24T10:45:05.781 INFO:teuthology.packaging:ref: None
2026-03-24T10:45:05.781 INFO:teuthology.packaging:tag: None
2026-03-24T10:45:05.781 INFO:teuthology.packaging:branch: tentacle
2026-03-24T10:45:05.781 INFO:teuthology.packaging:sha1: 70f8415b300f041766fa27faf7d5472699e32388
2026-03-24T10:45:05.781 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=ubuntu%2F22.04%2Fx86_64&ref=tentacle
2026-03-24T10:45:06.576 INFO:teuthology.task.internal:Found packages for ceph version 20.2.0-714-g147f7c6a-1jammy
2026-03-24T10:45:06.577 INFO:teuthology.run_tasks:Running task internal.buildpackages_prep...
2026-03-24T10:45:06.577 INFO:teuthology.task.internal:no buildpackages task found
2026-03-24T10:45:06.577 INFO:teuthology.run_tasks:Running task internal.save_config...
2026-03-24T10:45:06.578 INFO:teuthology.task.internal:Saving configuration
2026-03-24T10:45:06.583 INFO:teuthology.run_tasks:Running task internal.check_lock...
2026-03-24T10:45:06.584 INFO:teuthology.task.internal.check_lock:Checking locks...
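[editor's note] The package lookup above queries shaman's search API with the parameters logged by teuthology.packaging. As a sketch of how that URL is assembled — the parameter names and values are copied verbatim from the logged request and should be read as an observation of this run, not as a documented, stable API contract:

```shell
# Rebuild the shaman search URL seen in the log from its parts.
# status/project/flavor/distros/ref mirror the logged query string exactly.
base='https://shaman.ceph.com/api/search'
query='status=ready&project=ceph&flavor=default&distros=ubuntu%2F22.04%2Fx86_64&ref=tentacle'
url="${base}?${query}"
echo "$url"
```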
2026-03-24T10:45:06.590 DEBUG:teuthology.task.internal.check_lock:machine status is {'name': 'vm05.local', 'description': '/archive/kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps/3589', 'up': True, 'machine_type': 'vps', 'is_vm': True, 'vm_host': {'name': 'localhost', 'description': None, 'up': True, 'machine_type': 'libvirt', 'is_vm': False, 'vm_host': None, 'os_type': None, 'os_version': None, 'arch': None, 'locked': True, 'locked_since': None, 'locked_by': None, 'mac_address': None, 'ssh_pub_key': None}, 'os_type': 'ubuntu', 'os_version': '22.04', 'arch': 'x86_64', 'locked': True, 'locked_since': '2026-03-24 10:44:27.906676', 'locked_by': 'kyr', 'mac_address': '52:55:00:00:00:05', 'ssh_pub_key': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAVJJjIoDa2cTjdkjKgs/H+4qhgK0vF9IZI84LLvKo+3+ZBDA+acNPK9sl3SYRY6paO1CwgelHH/nU7wFhY4+34='}
2026-03-24T10:45:06.590 INFO:teuthology.run_tasks:Running task internal.add_remotes...
2026-03-24T10:45:06.591 INFO:teuthology.task.internal:roles: ubuntu@vm05.local - ['mon.a', 'mgr.x', 'osd.0', 'osd.1', 'osd.2', 'client.0']
2026-03-24T10:45:06.591 INFO:teuthology.run_tasks:Running task console_log...
2026-03-24T10:45:06.596 DEBUG:teuthology.task.console_log:vm05 does not support IPMI; excluding
2026-03-24T10:45:06.596 DEBUG:teuthology.exit:Installing handler: Handler(exiter=, func=.kill_console_loggers at 0x7f7f97f088b0>, signals=[15])
2026-03-24T10:45:06.596 INFO:teuthology.run_tasks:Running task internal.connect...
2026-03-24T10:45:06.597 INFO:teuthology.task.internal:Opening connections...
2026-03-24T10:45:06.597 DEBUG:teuthology.task.internal:connecting to ubuntu@vm05.local
2026-03-24T10:45:06.597 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm05.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-24T10:45:06.653 INFO:teuthology.run_tasks:Running task internal.push_inventory...
2026-03-24T10:45:06.655 DEBUG:teuthology.orchestra.run.vm05:> uname -m
2026-03-24T10:45:06.773 INFO:teuthology.orchestra.run.vm05.stdout:x86_64
2026-03-24T10:45:06.773 DEBUG:teuthology.orchestra.run.vm05:> cat /etc/os-release
2026-03-24T10:45:06.817 INFO:teuthology.orchestra.run.vm05.stdout:PRETTY_NAME="Ubuntu 22.04.5 LTS"
2026-03-24T10:45:06.817 INFO:teuthology.orchestra.run.vm05.stdout:NAME="Ubuntu"
2026-03-24T10:45:06.817 INFO:teuthology.orchestra.run.vm05.stdout:VERSION_ID="22.04"
2026-03-24T10:45:06.817 INFO:teuthology.orchestra.run.vm05.stdout:VERSION="22.04.5 LTS (Jammy Jellyfish)"
2026-03-24T10:45:06.817 INFO:teuthology.orchestra.run.vm05.stdout:VERSION_CODENAME=jammy
2026-03-24T10:45:06.817 INFO:teuthology.orchestra.run.vm05.stdout:ID=ubuntu
2026-03-24T10:45:06.817 INFO:teuthology.orchestra.run.vm05.stdout:ID_LIKE=debian
2026-03-24T10:45:06.817 INFO:teuthology.orchestra.run.vm05.stdout:HOME_URL="https://www.ubuntu.com/"
2026-03-24T10:45:06.817 INFO:teuthology.orchestra.run.vm05.stdout:SUPPORT_URL="https://help.ubuntu.com/"
2026-03-24T10:45:06.817 INFO:teuthology.orchestra.run.vm05.stdout:BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
2026-03-24T10:45:06.817 INFO:teuthology.orchestra.run.vm05.stdout:PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
2026-03-24T10:45:06.817 INFO:teuthology.orchestra.run.vm05.stdout:UBUNTU_CODENAME=jammy
2026-03-24T10:45:06.818 INFO:teuthology.lock.ops:Updating vm05.local on lock server
2026-03-24T10:45:06.822 INFO:teuthology.run_tasks:Running task internal.serialize_remote_roles...
2026-03-24T10:45:06.823 INFO:teuthology.run_tasks:Running task internal.check_conflict...
2026-03-24T10:45:06.824 INFO:teuthology.task.internal:Checking for old test directory...
2026-03-24T10:45:06.824 DEBUG:teuthology.orchestra.run.vm05:> test '!' -e /home/ubuntu/cephtest
2026-03-24T10:45:06.861 INFO:teuthology.run_tasks:Running task internal.check_ceph_data...
2026-03-24T10:45:06.862 INFO:teuthology.task.internal:Checking for non-empty /var/lib/ceph...
2026-03-24T10:45:06.862 DEBUG:teuthology.orchestra.run.vm05:> test -z $(ls -A /var/lib/ceph)
2026-03-24T10:45:06.906 INFO:teuthology.orchestra.run.vm05.stderr:ls: cannot access '/var/lib/ceph': No such file or directory
2026-03-24T10:45:06.906 INFO:teuthology.run_tasks:Running task internal.vm_setup...
2026-03-24T10:45:06.913 DEBUG:teuthology.orchestra.run.vm05:> test -e /ceph-qa-ready
2026-03-24T10:45:06.949 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-24T10:45:07.174 INFO:teuthology.run_tasks:Running task internal.base...
2026-03-24T10:45:07.175 INFO:teuthology.task.internal:Creating test directory...
2026-03-24T10:45:07.175 DEBUG:teuthology.orchestra.run.vm05:> mkdir -p -m0755 -- /home/ubuntu/cephtest
2026-03-24T10:45:07.178 INFO:teuthology.run_tasks:Running task internal.archive_upload...
2026-03-24T10:45:07.179 INFO:teuthology.run_tasks:Running task internal.archive...
2026-03-24T10:45:07.180 INFO:teuthology.task.internal:Creating archive directory...
2026-03-24T10:45:07.180 DEBUG:teuthology.orchestra.run.vm05:> install -d -m0755 -- /home/ubuntu/cephtest/archive
2026-03-24T10:45:07.223 INFO:teuthology.run_tasks:Running task internal.coredump...
2026-03-24T10:45:07.224 INFO:teuthology.task.internal:Enabling coredump saving...
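[editor's note] The `test -z $(ls -A /var/lib/ceph)` command logged above is the classic shell idiom for "directory is empty": `ls -A` lists everything except `.` and `..`, so its output is empty exactly when the directory is. A minimal sketch of the same idiom on a throwaway directory (unlike the logged command, the substitution is quoted here, which keeps it safe if a listing ever contains whitespace):

```shell
# Emptiness check via `ls -A`: empty output <=> empty directory.
dir=$(mktemp -d)
if test -z "$(ls -A "$dir")"; then
  state=empty
else
  state=non-empty
fi
rmdir "$dir"
echo "$state"
```

Note that when the directory does not exist at all, as in the log, `ls` prints nothing to stdout and only an error to stderr, so the check still passes while the error is recorded.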
2026-03-24T10:45:07.224 DEBUG:teuthology.orchestra.run.vm05:> test -f /run/.containerenv -o -f /.dockerenv
2026-03-24T10:45:07.265 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-24T10:45:07.265 DEBUG:teuthology.orchestra.run.vm05:> install -d -m0755 -- /home/ubuntu/cephtest/archive/coredump && sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core && echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf
2026-03-24T10:45:07.315 INFO:teuthology.orchestra.run.vm05.stdout:kernel.core_pattern = /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-24T10:45:07.319 INFO:teuthology.orchestra.run.vm05.stdout:kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-24T10:45:07.320 INFO:teuthology.run_tasks:Running task internal.sudo...
2026-03-24T10:45:07.321 INFO:teuthology.task.internal:Configuring sudo...
2026-03-24T10:45:07.321 DEBUG:teuthology.orchestra.run.vm05:> sudo sed -i.orig.teuthology -e 's/^\([^#]*\) \(requiretty\)/\1 !\2/g' -e 's/^\([^#]*\) !\(visiblepw\)/\1 \2/g' /etc/sudoers
2026-03-24T10:45:07.369 INFO:teuthology.run_tasks:Running task internal.syslog...
2026-03-24T10:45:07.371 INFO:teuthology.task.internal.syslog:Starting syslog monitoring...
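[editor's note] The coredump command above creates a per-job dump directory, points `kernel.core_pattern` at it, and appends the same setting to `/etc/sysctl.conf` so it survives a reboot. In the pattern, `%t` expands to the dump time (seconds since the epoch) and `%p` to the PID of the dumping process. A sketch that only composes the value, since the `sysctl -w` step needs root; the archive path here is a stand-in for the real `/home/ubuntu/cephtest/archive`:

```shell
# Compose the core_pattern setting the logged task writes.
archive=$(mktemp -d)
coredir="$archive/coredump"
install -d -m0755 -- "$coredir"
pattern="kernel.core_pattern=$coredir/%t.%p.core"
# On a real node this value would then be applied and persisted with:
#   sudo sysctl -w "$pattern" && echo "$pattern" | sudo tee -a /etc/sysctl.conf
echo "$pattern"
```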
2026-03-24T10:45:07.372 DEBUG:teuthology.orchestra.run.vm05:> mkdir -p -m0755 -- /home/ubuntu/cephtest/archive/syslog
2026-03-24T10:45:07.413 DEBUG:teuthology.orchestra.run.vm05:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-24T10:45:07.457 DEBUG:teuthology.orchestra.run.vm05:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-24T10:45:07.501 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-24T10:45:07.501 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/etc/rsyslog.d/80-cephtest.conf
2026-03-24T10:45:07.550 DEBUG:teuthology.orchestra.run.vm05:> sudo service rsyslog restart
2026-03-24T10:45:07.607 INFO:teuthology.run_tasks:Running task internal.timer...
2026-03-24T10:45:07.608 INFO:teuthology.task.internal:Starting timer...
2026-03-24T10:45:07.608 INFO:teuthology.run_tasks:Running task pcp...
2026-03-24T10:45:07.611 INFO:teuthology.run_tasks:Running task selinux...
2026-03-24T10:45:07.613 INFO:teuthology.task.selinux:Excluding vm05: VMs are not yet supported
2026-03-24T10:45:07.613 DEBUG:teuthology.task.selinux:Getting current SELinux state
2026-03-24T10:45:07.613 DEBUG:teuthology.task.selinux:Existing SELinux modes: {}
2026-03-24T10:45:07.613 INFO:teuthology.task.selinux:Putting SELinux into permissive mode
2026-03-24T10:45:07.613 INFO:teuthology.run_tasks:Running task ansible.cephlab...
2026-03-24T10:45:07.614 DEBUG:teuthology.task:Applying overrides for task ansible.cephlab: {'branch': 'main', 'repo': 'https://github.com/kshtsk/ceph-cm-ansible.git', 'skip_tags': 'nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs', 'vars': {'logical_volumes': {'lv_1': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_2': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_3': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_4': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}}, 'timezone': 'UTC', 'volume_groups': {'vg_nvme': {'pvs': '/dev/vdb,/dev/vdc,/dev/vdd,/dev/vde'}}}}
2026-03-24T10:45:07.615 DEBUG:teuthology.repo_utils:Setting repo remote to https://github.com/kshtsk/ceph-cm-ansible.git
2026-03-24T10:45:07.616 INFO:teuthology.repo_utils:Fetching github.com_kshtsk_ceph-cm-ansible_main from origin
2026-03-24T10:45:08.093 DEBUG:teuthology.repo_utils:Resetting repo at /home/teuthos/src/github.com_kshtsk_ceph-cm-ansible_main to origin/main
2026-03-24T10:45:08.099 INFO:teuthology.task.ansible:Playbook: [{'import_playbook': 'ansible_managed.yml'}, {'import_playbook': 'teuthology.yml'}, {'hosts': 'testnodes', 'tasks': [{'set_fact': {'ran_from_cephlab_playbook': True}}]}, {'import_playbook': 'testnodes.yml'}, {'import_playbook': 'container-host.yml'}, {'import_playbook': 'cobbler.yml'}, {'import_playbook': 'paddles.yml'}, {'import_playbook': 'pulpito.yml'}, {'hosts': 'testnodes', 'become': True, 'tasks': [{'name': 'Touch /ceph-qa-ready', 'file': {'path': '/ceph-qa-ready', 'state': 'touch'}, 'when': 'ran_from_cephlab_playbook|bool'}]}]
2026-03-24T10:45:08.100 DEBUG:teuthology.task.ansible:Running ansible-playbook -v --extra-vars '{"ansible_ssh_user": "ubuntu", "logical_volumes": {"lv_1": {"scratch_dev": true, "size": "25%VG", "vg": "vg_nvme"}, "lv_2": {"scratch_dev": true, "size": "25%VG", "vg": "vg_nvme"}, "lv_3": {"scratch_dev": true, "size": "25%VG", "vg": "vg_nvme"}, "lv_4": {"scratch_dev": true, "size": "25%VG", "vg": "vg_nvme"}}, "timezone": "UTC", "volume_groups": {"vg_nvme": {"pvs": "/dev/vdb,/dev/vdc,/dev/vdd,/dev/vde"}}}' -i /tmp/teuth_ansible_inventoryrfclq8tf --limit vm05.local /home/teuthos/src/github.com_kshtsk_ceph-cm-ansible_main/cephlab.yml --skip-tags nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
2026-03-24T10:47:16.844 DEBUG:teuthology.task.ansible:Reconnecting to [Remote(name='ubuntu@vm05.local')]
2026-03-24T10:47:16.845 INFO:teuthology.orchestra.remote:Trying to reconnect to host 'ubuntu@vm05.local'
2026-03-24T10:47:16.845 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm05.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-24T10:47:16.908 DEBUG:teuthology.orchestra.run.vm05:> true
2026-03-24T10:47:17.153 INFO:teuthology.orchestra.remote:Successfully reconnected to host 'ubuntu@vm05.local'
2026-03-24T10:47:17.153 INFO:teuthology.run_tasks:Running task clock...
2026-03-24T10:47:17.155 INFO:teuthology.task.clock:Syncing clocks and checking initial clock skew...
2026-03-24T10:47:17.156 INFO:teuthology.orchestra.run:Running command with timeout 360
2026-03-24T10:47:17.156 DEBUG:teuthology.orchestra.run.vm05:> sudo systemctl stop ntp.service || sudo systemctl stop ntpd.service || sudo systemctl stop chronyd.service ; sudo ntpd -gq || sudo chronyc makestep ; sudo systemctl start ntp.service || sudo systemctl start ntpd.service || sudo systemctl start chronyd.service ; PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-24T10:47:17.214 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:17 ntpd[16225]: ntpd 4.2.8p15@1.3728-o Wed Feb 16 17:13:02 UTC 2022 (1): Starting
2026-03-24T10:47:17.214 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:17 ntpd[16225]: Command line: ntpd -gq
2026-03-24T10:47:17.214 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:17 ntpd[16225]: ----------------------------------------------------
2026-03-24T10:47:17.214 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:17 ntpd[16225]: ntp-4 is maintained by Network Time Foundation,
2026-03-24T10:47:17.214 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:17 ntpd[16225]: Inc. (NTF), a non-profit 501(c)(3) public-benefit
2026-03-24T10:47:17.214 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:17 ntpd[16225]: corporation. Support and training for ntp-4 are
2026-03-24T10:47:17.214 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:17 ntpd[16225]: available at https://www.nwtime.org/support
2026-03-24T10:47:17.214 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:17 ntpd[16225]: ----------------------------------------------------
2026-03-24T10:47:17.214 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:17 ntpd[16225]: proto: precision = 0.040 usec (-24)
2026-03-24T10:47:17.214 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:17 ntpd[16225]: basedate set to 2022-02-04
2026-03-24T10:47:17.214 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:17 ntpd[16225]: gps base set to 2022-02-06 (week 2196)
2026-03-24T10:47:17.214 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:17 ntpd[16225]: leapsecond file ('/usr/share/zoneinfo/leap-seconds.list'): good hash signature
2026-03-24T10:47:17.214 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:17 ntpd[16225]: leapsecond file ('/usr/share/zoneinfo/leap-seconds.list'): loaded, expire=2025-12-28T00:00:00Z last=2017-01-01T00:00:00Z ofs=37
2026-03-24T10:47:17.214 INFO:teuthology.orchestra.run.vm05.stderr:24 Mar 10:47:17 ntpd[16225]: leapsecond file ('/usr/share/zoneinfo/leap-seconds.list'): expired 87 days ago
2026-03-24T10:47:17.215 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:17 ntpd[16225]: Listen and drop on 0 v6wildcard [::]:123
2026-03-24T10:47:17.215 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:17 ntpd[16225]: Listen and drop on 1 v4wildcard 0.0.0.0:123
2026-03-24T10:47:17.215 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:17 ntpd[16225]: Listen normally on 2 lo 127.0.0.1:123
2026-03-24T10:47:17.215 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:17 ntpd[16225]: Listen normally on 3 ens3 192.168.123.105:123
2026-03-24T10:47:17.215 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:17 ntpd[16225]: Listen normally on 4 lo [::1]:123
2026-03-24T10:47:17.215 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:17 ntpd[16225]: Listen normally on 5 ens3 [fe80::5055:ff:fe00:5%2]:123
2026-03-24T10:47:17.215 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:17 ntpd[16225]: Listening on routing socket on fd #22 for interface updates
2026-03-24T10:47:18.214 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:18 ntpd[16225]: Soliciting pool server 172.236.195.26
2026-03-24T10:47:19.213 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:19 ntpd[16225]: Soliciting pool server 128.140.109.119
2026-03-24T10:47:19.213 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:19 ntpd[16225]: Soliciting pool server 5.9.193.27
2026-03-24T10:47:20.213 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:20 ntpd[16225]: Soliciting pool server 178.215.228.24
2026-03-24T10:47:20.213 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:20 ntpd[16225]: Soliciting pool server 85.10.240.253
2026-03-24T10:47:20.213 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:20 ntpd[16225]: Soliciting pool server 93.241.86.156
2026-03-24T10:47:21.213 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:21 ntpd[16225]: Soliciting pool server 51.75.67.47
2026-03-24T10:47:21.213 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:21 ntpd[16225]: Soliciting pool server 144.76.43.40
2026-03-24T10:47:21.213 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:21 ntpd[16225]: Soliciting pool server 139.162.152.20
2026-03-24T10:47:21.213 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:21 ntpd[16225]: Soliciting pool server 141.84.43.75
2026-03-24T10:47:22.213 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:22 ntpd[16225]: Soliciting pool server 185.248.188.98
2026-03-24T10:47:22.213 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:22 ntpd[16225]: Soliciting pool server 185.207.105.38
2026-03-24T10:47:22.213 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:22 ntpd[16225]: Soliciting pool server 139.162.187.236
2026-03-24T10:47:22.213 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:22 ntpd[16225]: Soliciting pool server 185.125.190.57
2026-03-24T10:47:23.213 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:23 ntpd[16225]: Soliciting pool server 185.125.190.56
2026-03-24T10:47:23.213 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:23 ntpd[16225]: Soliciting pool server 212.132.97.26
2026-03-24T10:47:23.213 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:23 ntpd[16225]: Soliciting pool server 116.203.96.227
2026-03-24T10:47:25.236 INFO:teuthology.orchestra.run.vm05.stdout:24 Mar 10:47:25 ntpd[16225]: ntpd: time slew -0.001416 s
2026-03-24T10:47:25.237 INFO:teuthology.orchestra.run.vm05.stdout:ntpd: time slew -0.001416s
2026-03-24T10:47:25.259 INFO:teuthology.orchestra.run.vm05.stdout: remote refid st t when poll reach delay offset jitter
2026-03-24T10:47:25.259 INFO:teuthology.orchestra.run.vm05.stdout:==============================================================================
2026-03-24T10:47:25.259 INFO:teuthology.orchestra.run.vm05.stdout: 0.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-24T10:47:25.259 INFO:teuthology.orchestra.run.vm05.stdout: 1.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-24T10:47:25.259 INFO:teuthology.orchestra.run.vm05.stdout: 2.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-24T10:47:25.259 INFO:teuthology.orchestra.run.vm05.stdout: 3.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-24T10:47:25.259 INFO:teuthology.orchestra.run.vm05.stdout: ntp.ubuntu.com .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-24T10:47:25.259 INFO:teuthology.run_tasks:Running task install...
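[editor's note] The clock task's one-liner above chains alternatives with `||` and steps with `;` so the same command works whichever time daemon the distro ships: stop ntp or ntpd or chronyd, step the clock with `ntpd -gq` or `chronyc makestep`, restart whichever daemon exists. A toy sketch of that control flow, with echo stand-ins for the real service commands (`|| true` at the end just keeps the sketch's exit status clean):

```shell
# `a || b` runs b only when a fails; `cmd ; next` runs next unconditionally.
# stop_svc simulates "service not installed" by always failing, so every
# alternative in the chain is tried, and the `;`-separated step still runs.
log=''
stop_svc() { log="${log}${log:+ }stop-$1"; false; }
stop_svc ntp || stop_svc ntpd || stop_svc chronyd || true ; log="${log} step-clock"
echo "$log"
```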
2026-03-24T10:47:25.261 DEBUG:teuthology.task.install:project ceph
2026-03-24T10:47:25.261 DEBUG:teuthology.task.install:INSTALL overrides: {'ceph': {'flavor': 'default', 'sha1': '70f8415b300f041766fa27faf7d5472699e32388'}, 'extra_system_packages': {'deb': ['python3-jmespath', 'python3-xmltodict', 's3cmd'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-jmespath', 'python3-xmltodict', 's3cmd']}}
2026-03-24T10:47:25.261 DEBUG:teuthology.task.install:config {'flavor': 'default', 'sha1': '70f8415b300f041766fa27faf7d5472699e32388', 'extra_system_packages': {'deb': ['python3-jmespath', 'python3-xmltodict', 's3cmd'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-jmespath', 'python3-xmltodict', 's3cmd']}}
2026-03-24T10:47:25.261 INFO:teuthology.task.install:Using flavor: default
2026-03-24T10:47:25.264 DEBUG:teuthology.task.install:Package list is: {'deb': ['ceph', 'cephadm', 'ceph-mds', 'ceph-mgr', 'ceph-common', 'ceph-fuse', 'ceph-test', 'ceph-volume', 'radosgw', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'libcephfs2', 'libcephfs-dev', 'librados2', 'librbd1', 'rbd-fuse'], 'rpm': ['ceph-radosgw', 'ceph-test', 'ceph', 'ceph-base', 'cephadm', 'ceph-immutable-object-cache', 'ceph-mgr', 'ceph-mgr-dashboard', 'ceph-mgr-diskprediction-local', 'ceph-mgr-rook', 'ceph-mgr-cephadm', 'ceph-fuse', 'ceph-volume', 'librados-devel', 'libcephfs2', 'libcephfs-devel', 'librados2', 'librbd1', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'rbd-fuse', 'rbd-mirror', 'rbd-nbd']}
2026-03-24T10:47:25.264 INFO:teuthology.task.install:extra packages: []
2026-03-24T10:47:25.264 DEBUG:teuthology.orchestra.run.vm05:> sudo apt-key list | grep Ceph
2026-03-24T10:47:25.348 INFO:teuthology.orchestra.run.vm05.stderr:Warning: apt-key is deprecated. Manage keyring files in trusted.gpg.d instead (see apt-key(8)).
2026-03-24T10:47:25.368 INFO:teuthology.orchestra.run.vm05.stdout:uid [ unknown] Ceph automated package build (Ceph automated package build)
2026-03-24T10:47:25.368 INFO:teuthology.orchestra.run.vm05.stdout:uid [ unknown] Ceph.com (release key)
2026-03-24T10:47:25.368 INFO:teuthology.task.install.deb:Installing packages: ceph, cephadm, ceph-mds, ceph-mgr, ceph-common, ceph-fuse, ceph-test, ceph-volume, radosgw, python3-rados, python3-rgw, python3-cephfs, python3-rbd, libcephfs2, libcephfs-dev, librados2, librbd1, rbd-fuse on remote deb x86_64
2026-03-24T10:47:25.369 INFO:teuthology.task.install.deb:Installing system (non-project) packages: python3-jmespath, python3-xmltodict, s3cmd on remote deb x86_64
2026-03-24T10:47:25.369 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=ubuntu%2F22.04%2Fx86_64&sha1=70f8415b300f041766fa27faf7d5472699e32388
2026-03-24T10:47:25.974 INFO:teuthology.task.install.deb:Pulling from https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default/
2026-03-24T10:47:25.974 INFO:teuthology.task.install.deb:Package version is 20.2.0-712-g70f8415b-1jammy
2026-03-24T10:47:26.449 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-24T10:47:26.450 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/etc/apt/sources.list.d/ceph.list
2026-03-24T10:47:26.458 DEBUG:teuthology.orchestra.run.vm05:> sudo apt-get update
2026-03-24T10:47:26.621 INFO:teuthology.orchestra.run.vm05.stdout:Hit:1 http://security.ubuntu.com/ubuntu jammy-security InRelease
2026-03-24T10:47:26.622 INFO:teuthology.orchestra.run.vm05.stdout:Hit:2 http://archive.ubuntu.com/ubuntu jammy InRelease
2026-03-24T10:47:26.652 INFO:teuthology.orchestra.run.vm05.stdout:Hit:3 http://archive.ubuntu.com/ubuntu jammy-updates InRelease
2026-03-24T10:47:26.687 INFO:teuthology.orchestra.run.vm05.stdout:Hit:4 http://archive.ubuntu.com/ubuntu jammy-backports InRelease
2026-03-24T10:47:27.190 INFO:teuthology.orchestra.run.vm05.stdout:Ign:5 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy InRelease
2026-03-24T10:47:27.304 INFO:teuthology.orchestra.run.vm05.stdout:Get:6 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy Release [7680 B]
2026-03-24T10:47:27.418 INFO:teuthology.orchestra.run.vm05.stdout:Ign:7 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy Release.gpg
2026-03-24T10:47:27.532 INFO:teuthology.orchestra.run.vm05.stdout:Get:8 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 Packages [18.8 kB]
2026-03-24T10:47:27.613 INFO:teuthology.orchestra.run.vm05.stdout:Fetched 26.5 kB in 1s (26.6 kB/s)
2026-03-24T10:47:28.370 INFO:teuthology.orchestra.run.vm05.stdout:Reading package lists...
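[editor's note] The install command that follows pins every package with apt's `name=version` syntax, so exactly the chacra build 20.2.0-712-g70f8415b-1jammy is installed rather than whatever the archive considers the newest candidate. A sketch of how such a pinned argument list can be assembled (composition only, nothing is installed; the package list is abbreviated):

```shell
# Build "pkg=version" arguments for a version-pinned apt-get install.
ver='20.2.0-712-g70f8415b-1jammy'
args=''
for pkg in ceph cephadm ceph-common; do   # abbreviated from the logged list
  args="${args}${args:+ }${pkg}=${ver}"
done
echo "apt-get -y install $args"
```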
2026-03-24T10:47:28.384 DEBUG:teuthology.orchestra.run.vm05:> sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" install ceph=20.2.0-712-g70f8415b-1jammy cephadm=20.2.0-712-g70f8415b-1jammy ceph-mds=20.2.0-712-g70f8415b-1jammy ceph-mgr=20.2.0-712-g70f8415b-1jammy ceph-common=20.2.0-712-g70f8415b-1jammy ceph-fuse=20.2.0-712-g70f8415b-1jammy ceph-test=20.2.0-712-g70f8415b-1jammy ceph-volume=20.2.0-712-g70f8415b-1jammy radosgw=20.2.0-712-g70f8415b-1jammy python3-rados=20.2.0-712-g70f8415b-1jammy python3-rgw=20.2.0-712-g70f8415b-1jammy python3-cephfs=20.2.0-712-g70f8415b-1jammy python3-rbd=20.2.0-712-g70f8415b-1jammy libcephfs2=20.2.0-712-g70f8415b-1jammy libcephfs-dev=20.2.0-712-g70f8415b-1jammy librados2=20.2.0-712-g70f8415b-1jammy librbd1=20.2.0-712-g70f8415b-1jammy rbd-fuse=20.2.0-712-g70f8415b-1jammy
2026-03-24T10:47:28.418 INFO:teuthology.orchestra.run.vm05.stdout:Reading package lists...
2026-03-24T10:47:28.621 INFO:teuthology.orchestra.run.vm05.stdout:Building dependency tree...
2026-03-24T10:47:28.622 INFO:teuthology.orchestra.run.vm05.stdout:Reading state information...
2026-03-24T10:47:28.825 INFO:teuthology.orchestra.run.vm05.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T10:47:28.826 INFO:teuthology.orchestra.run.vm05.stdout:  kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-24T10:47:28.826 INFO:teuthology.orchestra.run.vm05.stdout:  libsgutils2-2 sg3-utils sg3-utils-udev
2026-03-24T10:47:28.826 INFO:teuthology.orchestra.run.vm05.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T10:47:28.826 INFO:teuthology.orchestra.run.vm05.stdout:The following additional packages will be installed:
2026-03-24T10:47:28.826 INFO:teuthology.orchestra.run.vm05.stdout:  ceph-base ceph-mgr-cephadm ceph-mgr-dashboard ceph-mgr-diskprediction-local
2026-03-24T10:47:28.827 INFO:teuthology.orchestra.run.vm05.stdout:  ceph-mgr-k8sevents ceph-mgr-modules-core ceph-mon ceph-osd jq
2026-03-24T10:47:28.827 INFO:teuthology.orchestra.run.vm05.stdout:  libcephfs-daemon libcephfs-proxy2 libdouble-conversion3 libfuse2 libjq1
2026-03-24T10:47:28.827 INFO:teuthology.orchestra.run.vm05.stdout:  liblttng-ust1 libnbd0 liboath0 libonig5 libpcre2-16-0 libqt5core5a
2026-03-24T10:47:28.827 INFO:teuthology.orchestra.run.vm05.stdout:  libqt5dbus5 libqt5network5 libradosstriper1 librdkafka1 librgw2
2026-03-24T10:47:28.827 INFO:teuthology.orchestra.run.vm05.stdout:  libsqlite3-mod-ceph libthrift-0.16.0 nvme-cli python-asyncssh-doc
2026-03-24T10:47:28.827 INFO:teuthology.orchestra.run.vm05.stdout:  python3-asyncssh python3-cachetools python3-ceph-argparse
2026-03-24T10:47:28.827 INFO:teuthology.orchestra.run.vm05.stdout:  python3-ceph-common python3-cheroot python3-cherrypy3 python3-google-auth
2026-03-24T10:47:28.827 INFO:teuthology.orchestra.run.vm05.stdout:  python3-iniconfig python3-jaraco.classes python3-jaraco.collections
2026-03-24T10:47:28.827 INFO:teuthology.orchestra.run.vm05.stdout:  python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-24T10:47:28.827 INFO:teuthology.orchestra.run.vm05.stdout:  python3-kubernetes python3-natsort python3-pluggy python3-portend
2026-03-24T10:47:28.827 INFO:teuthology.orchestra.run.vm05.stdout:  python3-prettytable python3-psutil python3-py python3-pygments
2026-03-24T10:47:28.827 INFO:teuthology.orchestra.run.vm05.stdout:  python3-pytest python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-24T10:47:28.827 INFO:teuthology.orchestra.run.vm05.stdout:  python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-24T10:47:28.827 INFO:teuthology.orchestra.run.vm05.stdout:  python3-tempora python3-threadpoolctl python3-toml python3-wcwidth
2026-03-24T10:47:28.827 INFO:teuthology.orchestra.run.vm05.stdout:  python3-webob python3-websocket python3-zc.lockfile qttranslations5-l10n
2026-03-24T10:47:28.827 INFO:teuthology.orchestra.run.vm05.stdout:  smartmontools socat xmlstarlet
2026-03-24T10:47:28.828 INFO:teuthology.orchestra.run.vm05.stdout:Suggested packages:
2026-03-24T10:47:28.828 INFO:teuthology.orchestra.run.vm05.stdout:  python3-influxdb liblua5.3-dev luarocks python-natsort-doc python-psutil-doc
2026-03-24T10:47:28.828 INFO:teuthology.orchestra.run.vm05.stdout:  subversion python-pygments-doc ttf-bitstream-vera python3-paste python3-dap
2026-03-24T10:47:28.828 INFO:teuthology.orchestra.run.vm05.stdout:  python-sklearn-doc ipython3 python-webob-doc gsmartcontrol smart-notifier
2026-03-24T10:47:28.828 INFO:teuthology.orchestra.run.vm05.stdout:  mailx | mailutils
2026-03-24T10:47:28.828 INFO:teuthology.orchestra.run.vm05.stdout:Recommended packages:
2026-03-24T10:47:28.828 INFO:teuthology.orchestra.run.vm05.stdout:  btrfs-tools
2026-03-24T10:47:28.877 INFO:teuthology.orchestra.run.vm05.stdout:The following NEW packages will be installed:
2026-03-24T10:47:28.877 INFO:teuthology.orchestra.run.vm05.stdout:  ceph ceph-base ceph-common ceph-fuse ceph-mds ceph-mgr ceph-mgr-cephadm
2026-03-24T10:47:28.877 INFO:teuthology.orchestra.run.vm05.stdout:  ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-k8sevents
2026-03-24T10:47:28.877 INFO:teuthology.orchestra.run.vm05.stdout:  ceph-mgr-modules-core ceph-mon ceph-osd ceph-test ceph-volume cephadm jq
2026-03-24T10:47:28.877 INFO:teuthology.orchestra.run.vm05.stdout:  libcephfs-daemon libcephfs-dev libcephfs-proxy2 libcephfs2
2026-03-24T10:47:28.877 INFO:teuthology.orchestra.run.vm05.stdout:  libdouble-conversion3 libfuse2 libjq1 liblttng-ust1 libnbd0 liboath0
2026-03-24T10:47:28.877 INFO:teuthology.orchestra.run.vm05.stdout:  libonig5 libpcre2-16-0 libqt5core5a libqt5dbus5 libqt5network5
2026-03-24T10:47:28.878 INFO:teuthology.orchestra.run.vm05.stdout:  libradosstriper1 librdkafka1 librgw2 libsqlite3-mod-ceph libthrift-0.16.0
2026-03-24T10:47:28.878 INFO:teuthology.orchestra.run.vm05.stdout:  nvme-cli python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-24T10:47:28.878 INFO:teuthology.orchestra.run.vm05.stdout:  python3-ceph-argparse python3-ceph-common python3-cephfs python3-cheroot
2026-03-24T10:47:28.878 INFO:teuthology.orchestra.run.vm05.stdout:  python3-cherrypy3 python3-google-auth python3-iniconfig
2026-03-24T10:47:28.878 INFO:teuthology.orchestra.run.vm05.stdout:  python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools
2026-03-24T10:47:28.878 INFO:teuthology.orchestra.run.vm05.stdout:  python3-jaraco.text python3-joblib python3-kubernetes python3-natsort
2026-03-24T10:47:28.878 INFO:teuthology.orchestra.run.vm05.stdout:  python3-pluggy python3-portend python3-prettytable python3-psutil python3-py
2026-03-24T10:47:28.878 INFO:teuthology.orchestra.run.vm05.stdout:  python3-pygments python3-pytest python3-rados python3-rbd python3-repoze.lru
2026-03-24T10:47:28.878 INFO:teuthology.orchestra.run.vm05.stdout:  python3-requests-oauthlib python3-rgw python3-routes python3-rsa
2026-03-24T10:47:28.878 INFO:teuthology.orchestra.run.vm05.stdout:  python3-simplejson python3-sklearn python3-sklearn-lib python3-tempora
2026-03-24T10:47:28.878 INFO:teuthology.orchestra.run.vm05.stdout:  python3-threadpoolctl python3-toml python3-wcwidth python3-webob
2026-03-24T10:47:28.878 INFO:teuthology.orchestra.run.vm05.stdout:  python3-websocket python3-zc.lockfile qttranslations5-l10n radosgw rbd-fuse
2026-03-24T10:47:28.878 INFO:teuthology.orchestra.run.vm05.stdout:  smartmontools socat xmlstarlet
2026-03-24T10:47:28.879 INFO:teuthology.orchestra.run.vm05.stdout:The following packages will be upgraded:
2026-03-24T10:47:28.879 INFO:teuthology.orchestra.run.vm05.stdout:  librados2
librbd1 2026-03-24T10:47:29.089 INFO:teuthology.orchestra.run.vm05.stdout:2 upgraded, 85 newly installed, 0 to remove and 44 not upgraded. 2026-03-24T10:47:29.089 INFO:teuthology.orchestra.run.vm05.stdout:Need to get 281 MB of archives. 2026-03-24T10:47:29.089 INFO:teuthology.orchestra.run.vm05.stdout:After this operation, 1092 MB of additional disk space will be used. 2026-03-24T10:47:29.089 INFO:teuthology.orchestra.run.vm05.stdout:Get:1 http://archive.ubuntu.com/ubuntu jammy/main amd64 liblttng-ust1 amd64 2.13.1-1ubuntu1 [190 kB] 2026-03-24T10:47:29.467 INFO:teuthology.orchestra.run.vm05.stdout:Get:2 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 librbd1 amd64 20.2.0-712-g70f8415b-1jammy [2867 kB] 2026-03-24T10:47:29.580 INFO:teuthology.orchestra.run.vm05.stdout:Get:3 http://archive.ubuntu.com/ubuntu jammy/universe amd64 libdouble-conversion3 amd64 3.1.7-4 [39.0 kB] 2026-03-24T10:47:29.595 INFO:teuthology.orchestra.run.vm05.stdout:Get:4 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 libpcre2-16-0 amd64 10.39-3ubuntu0.1 [203 kB] 2026-03-24T10:47:29.690 INFO:teuthology.orchestra.run.vm05.stdout:Get:5 http://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 libqt5core5a amd64 5.15.3+dfsg-2ubuntu0.2 [2006 kB] 2026-03-24T10:47:29.963 INFO:teuthology.orchestra.run.vm05.stdout:Get:6 http://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 libqt5dbus5 amd64 5.15.3+dfsg-2ubuntu0.2 [222 kB] 2026-03-24T10:47:29.979 INFO:teuthology.orchestra.run.vm05.stdout:Get:7 http://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 libqt5network5 amd64 5.15.3+dfsg-2ubuntu0.2 [731 kB] 2026-03-24T10:47:30.015 INFO:teuthology.orchestra.run.vm05.stdout:Get:8 http://archive.ubuntu.com/ubuntu jammy/universe amd64 libthrift-0.16.0 amd64 0.16.0-2 [267 kB] 2026-03-24T10:47:30.026 INFO:teuthology.orchestra.run.vm05.stdout:Get:9 http://archive.ubuntu.com/ubuntu jammy/universe amd64 libnbd0 
amd64 1.10.5-1 [71.3 kB] 2026-03-24T10:47:30.029 INFO:teuthology.orchestra.run.vm05.stdout:Get:10 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-wcwidth all 0.2.5+dfsg1-1 [21.9 kB] 2026-03-24T10:47:30.029 INFO:teuthology.orchestra.run.vm05.stdout:Get:11 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-prettytable all 2.5.0-2 [31.3 kB] 2026-03-24T10:47:30.031 INFO:teuthology.orchestra.run.vm05.stdout:Get:12 http://archive.ubuntu.com/ubuntu jammy/universe amd64 librdkafka1 amd64 1.8.0-1build1 [633 kB] 2026-03-24T10:47:30.053 INFO:teuthology.orchestra.run.vm05.stdout:Get:13 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 liboath0 amd64 2.6.7-3ubuntu0.1 [41.3 kB] 2026-03-24T10:47:30.053 INFO:teuthology.orchestra.run.vm05.stdout:Get:14 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.functools all 3.4.0-2 [9030 B] 2026-03-24T10:47:30.060 INFO:teuthology.orchestra.run.vm05.stdout:Get:15 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-cheroot all 8.5.2+ds1-1ubuntu3.1 [71.1 kB] 2026-03-24T10:47:30.152 INFO:teuthology.orchestra.run.vm05.stdout:Get:16 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.classes all 3.2.1-3 [6452 B] 2026-03-24T10:47:30.152 INFO:teuthology.orchestra.run.vm05.stdout:Get:17 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.text all 3.6.0-2 [8716 B] 2026-03-24T10:47:30.152 INFO:teuthology.orchestra.run.vm05.stdout:Get:18 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.collections all 3.4.0-2 [11.4 kB] 2026-03-24T10:47:30.152 INFO:teuthology.orchestra.run.vm05.stdout:Get:19 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-tempora all 4.1.2-1 [14.8 kB] 2026-03-24T10:47:30.153 INFO:teuthology.orchestra.run.vm05.stdout:Get:20 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-portend all 3.0.0-1 [7240 B] 2026-03-24T10:47:30.153 INFO:teuthology.orchestra.run.vm05.stdout:Get:21 http://archive.ubuntu.com/ubuntu jammy/main amd64 
python3-zc.lockfile all 2.0-1 [8980 B] 2026-03-24T10:47:30.153 INFO:teuthology.orchestra.run.vm05.stdout:Get:22 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-cherrypy3 all 18.6.1-4 [208 kB] 2026-03-24T10:47:30.155 INFO:teuthology.orchestra.run.vm05.stdout:Get:23 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-natsort all 8.0.2-1 [35.3 kB] 2026-03-24T10:47:30.157 INFO:teuthology.orchestra.run.vm05.stdout:Get:24 http://archive.ubuntu.com/ubuntu jammy/universe amd64 libfuse2 amd64 2.9.9-5ubuntu3 [90.3 kB] 2026-03-24T10:47:30.252 INFO:teuthology.orchestra.run.vm05.stdout:Get:25 http://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 python3-asyncssh all 2.5.0-1ubuntu0.1 [189 kB] 2026-03-24T10:47:30.255 INFO:teuthology.orchestra.run.vm05.stdout:Get:26 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-repoze.lru all 0.7-2 [12.1 kB] 2026-03-24T10:47:30.255 INFO:teuthology.orchestra.run.vm05.stdout:Get:27 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-routes all 2.5.1-1ubuntu1 [89.0 kB] 2026-03-24T10:47:30.256 INFO:teuthology.orchestra.run.vm05.stdout:Get:28 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-sklearn-lib amd64 0.23.2-5ubuntu6 [2058 kB] 2026-03-24T10:47:30.266 INFO:teuthology.orchestra.run.vm05.stdout:Get:29 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 librados2 amd64 20.2.0-712-g70f8415b-1jammy [3583 kB] 2026-03-24T10:47:30.376 INFO:teuthology.orchestra.run.vm05.stdout:Get:30 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-joblib all 0.17.0-4ubuntu1 [204 kB] 2026-03-24T10:47:30.377 INFO:teuthology.orchestra.run.vm05.stdout:Get:31 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-threadpoolctl all 3.1.0-1 [21.3 kB] 2026-03-24T10:47:30.377 INFO:teuthology.orchestra.run.vm05.stdout:Get:32 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-sklearn all 0.23.2-5ubuntu6 [1829 kB] 
2026-03-24T10:47:30.472 INFO:teuthology.orchestra.run.vm05.stdout:Get:33 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-cachetools all 5.0.0-1 [9722 B] 2026-03-24T10:47:30.472 INFO:teuthology.orchestra.run.vm05.stdout:Get:34 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-rsa all 4.8-1 [28.4 kB] 2026-03-24T10:47:30.472 INFO:teuthology.orchestra.run.vm05.stdout:Get:35 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-google-auth all 1.5.1-3 [35.7 kB] 2026-03-24T10:47:30.472 INFO:teuthology.orchestra.run.vm05.stdout:Get:36 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-requests-oauthlib all 1.3.0+ds-0.1 [18.7 kB] 2026-03-24T10:47:30.472 INFO:teuthology.orchestra.run.vm05.stdout:Get:37 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-websocket all 1.2.3-1 [34.7 kB] 2026-03-24T10:47:30.472 INFO:teuthology.orchestra.run.vm05.stdout:Get:38 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-kubernetes all 12.0.1-1ubuntu1 [353 kB] 2026-03-24T10:47:30.502 INFO:teuthology.orchestra.run.vm05.stdout:Get:39 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 libcephfs2 amd64 20.2.0-712-g70f8415b-1jammy [829 kB] 2026-03-24T10:47:30.508 INFO:teuthology.orchestra.run.vm05.stdout:Get:40 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 python3-rados amd64 20.2.0-712-g70f8415b-1jammy [364 kB] 2026-03-24T10:47:30.512 INFO:teuthology.orchestra.run.vm05.stdout:Get:41 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 python3-ceph-argparse all 20.2.0-712-g70f8415b-1jammy [32.8 kB] 2026-03-24T10:47:30.512 INFO:teuthology.orchestra.run.vm05.stdout:Get:42 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 
python3-cephfs amd64 20.2.0-712-g70f8415b-1jammy [184 kB] 2026-03-24T10:47:30.515 INFO:teuthology.orchestra.run.vm05.stdout:Get:43 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 python3-ceph-common all 20.2.0-712-g70f8415b-1jammy [83.8 kB] 2026-03-24T10:47:30.517 INFO:teuthology.orchestra.run.vm05.stdout:Get:44 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 python3-rbd amd64 20.2.0-712-g70f8415b-1jammy [341 kB] 2026-03-24T10:47:30.522 INFO:teuthology.orchestra.run.vm05.stdout:Get:45 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 librgw2 amd64 20.2.0-712-g70f8415b-1jammy [8697 kB] 2026-03-24T10:47:30.565 INFO:teuthology.orchestra.run.vm05.stdout:Get:46 http://archive.ubuntu.com/ubuntu jammy/main amd64 libonig5 amd64 6.9.7.1-2build1 [172 kB] 2026-03-24T10:47:30.567 INFO:teuthology.orchestra.run.vm05.stdout:Get:47 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 libjq1 amd64 1.6-2.1ubuntu3.1 [133 kB] 2026-03-24T10:47:30.568 INFO:teuthology.orchestra.run.vm05.stdout:Get:48 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 jq amd64 1.6-2.1ubuntu3.1 [52.5 kB] 2026-03-24T10:47:30.662 INFO:teuthology.orchestra.run.vm05.stdout:Get:49 http://archive.ubuntu.com/ubuntu jammy/main amd64 socat amd64 1.7.4.1-3ubuntu4 [349 kB] 2026-03-24T10:47:30.666 INFO:teuthology.orchestra.run.vm05.stdout:Get:50 http://archive.ubuntu.com/ubuntu jammy/universe amd64 xmlstarlet amd64 1.6.1-2.1 [265 kB] 2026-03-24T10:47:30.669 INFO:teuthology.orchestra.run.vm05.stdout:Get:51 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 nvme-cli amd64 1.16-3ubuntu0.3 [474 kB] 2026-03-24T10:47:30.674 INFO:teuthology.orchestra.run.vm05.stdout:Get:52 http://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 python-asyncssh-doc all 
2.5.0-1ubuntu0.1 [309 kB] 2026-03-24T10:47:30.677 INFO:teuthology.orchestra.run.vm05.stdout:Get:53 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-iniconfig all 1.1.1-2 [6024 B] 2026-03-24T10:47:30.678 INFO:teuthology.orchestra.run.vm05.stdout:Get:54 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-pluggy all 0.13.0-7.1 [19.0 kB] 2026-03-24T10:47:30.678 INFO:teuthology.orchestra.run.vm05.stdout:Get:55 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-psutil amd64 5.9.0-1build1 [158 kB] 2026-03-24T10:47:30.759 INFO:teuthology.orchestra.run.vm05.stdout:Get:56 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-py all 1.10.0-1 [71.9 kB] 2026-03-24T10:47:30.759 INFO:teuthology.orchestra.run.vm05.stdout:Get:57 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-pygments all 2.11.2+dfsg-2ubuntu0.1 [750 kB] 2026-03-24T10:47:30.768 INFO:teuthology.orchestra.run.vm05.stdout:Get:58 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-toml all 0.10.2-1 [16.5 kB] 2026-03-24T10:47:30.856 INFO:teuthology.orchestra.run.vm05.stdout:Get:59 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-pytest all 6.2.5-1ubuntu2 [214 kB] 2026-03-24T10:47:30.859 INFO:teuthology.orchestra.run.vm05.stdout:Get:60 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-simplejson amd64 3.17.6-1build1 [54.7 kB] 2026-03-24T10:47:30.859 INFO:teuthology.orchestra.run.vm05.stdout:Get:61 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-webob all 1:1.8.6-1.1ubuntu0.1 [86.7 kB] 2026-03-24T10:47:30.860 INFO:teuthology.orchestra.run.vm05.stdout:Get:62 http://archive.ubuntu.com/ubuntu jammy/universe amd64 qttranslations5-l10n all 5.15.3-1 [1983 kB] 2026-03-24T10:47:30.883 INFO:teuthology.orchestra.run.vm05.stdout:Get:63 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 smartmontools amd64 7.2-1ubuntu0.1 [583 kB] 2026-03-24T10:47:30.959 INFO:teuthology.orchestra.run.vm05.stdout:Get:64 
https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 python3-rgw amd64 20.2.0-712-g70f8415b-1jammy [112 kB] 2026-03-24T10:47:30.960 INFO:teuthology.orchestra.run.vm05.stdout:Get:65 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 libradosstriper1 amd64 20.2.0-712-g70f8415b-1jammy [261 kB] 2026-03-24T10:47:30.961 INFO:teuthology.orchestra.run.vm05.stdout:Get:66 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-common amd64 20.2.0-712-g70f8415b-1jammy [29.3 MB] 2026-03-24T10:47:32.205 INFO:teuthology.orchestra.run.vm05.stdout:Get:67 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-base amd64 20.2.0-712-g70f8415b-1jammy [5415 kB] 2026-03-24T10:47:32.427 INFO:teuthology.orchestra.run.vm05.stdout:Get:68 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-modules-core all 20.2.0-712-g70f8415b-1jammy [246 kB] 2026-03-24T10:47:32.431 INFO:teuthology.orchestra.run.vm05.stdout:Get:69 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 libsqlite3-mod-ceph amd64 20.2.0-712-g70f8415b-1jammy [124 kB] 2026-03-24T10:47:32.432 INFO:teuthology.orchestra.run.vm05.stdout:Get:70 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr amd64 20.2.0-712-g70f8415b-1jammy [906 kB] 2026-03-24T10:47:32.441 INFO:teuthology.orchestra.run.vm05.stdout:Get:71 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mon amd64 20.2.0-712-g70f8415b-1jammy [6399 kB] 
2026-03-24T10:47:32.680 INFO:teuthology.orchestra.run.vm05.stdout:Get:72 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-osd amd64 20.2.0-712-g70f8415b-1jammy [21.7 MB] 2026-03-24T10:47:33.600 INFO:teuthology.orchestra.run.vm05.stdout:Get:73 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph amd64 20.2.0-712-g70f8415b-1jammy [14.1 kB] 2026-03-24T10:47:33.600 INFO:teuthology.orchestra.run.vm05.stdout:Get:74 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-fuse amd64 20.2.0-712-g70f8415b-1jammy [955 kB] 2026-03-24T10:47:33.683 INFO:teuthology.orchestra.run.vm05.stdout:Get:75 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mds amd64 20.2.0-712-g70f8415b-1jammy [2341 kB] 2026-03-24T10:47:33.738 INFO:teuthology.orchestra.run.vm05.stdout:Get:76 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 cephadm amd64 20.2.0-712-g70f8415b-1jammy [1049 kB] 2026-03-24T10:47:33.805 INFO:teuthology.orchestra.run.vm05.stdout:Get:77 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-cephadm all 20.2.0-712-g70f8415b-1jammy [179 kB] 2026-03-24T10:47:33.919 INFO:teuthology.orchestra.run.vm05.stdout:Get:78 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-dashboard all 20.2.0-712-g70f8415b-1jammy [45.5 MB] 2026-03-24T10:47:36.886 INFO:teuthology.orchestra.run.vm05.stdout:Get:79 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 
ceph-mgr-diskprediction-local all 20.2.0-712-g70f8415b-1jammy [8625 kB] 2026-03-24T10:47:37.430 INFO:teuthology.orchestra.run.vm05.stdout:Get:80 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-k8sevents all 20.2.0-712-g70f8415b-1jammy [14.2 kB] 2026-03-24T10:47:37.430 INFO:teuthology.orchestra.run.vm05.stdout:Get:81 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-test amd64 20.2.0-712-g70f8415b-1jammy [99.5 MB] 2026-03-24T10:47:43.666 INFO:teuthology.orchestra.run.vm05.stdout:Get:82 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 ceph-volume all 20.2.0-712-g70f8415b-1jammy [135 kB] 2026-03-24T10:47:43.666 INFO:teuthology.orchestra.run.vm05.stdout:Get:83 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 libcephfs-daemon amd64 20.2.0-712-g70f8415b-1jammy [43.3 kB] 2026-03-24T10:47:43.667 INFO:teuthology.orchestra.run.vm05.stdout:Get:84 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 libcephfs-proxy2 amd64 20.2.0-712-g70f8415b-1jammy [30.7 kB] 2026-03-24T10:47:43.667 INFO:teuthology.orchestra.run.vm05.stdout:Get:85 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 libcephfs-dev amd64 20.2.0-712-g70f8415b-1jammy [41.5 kB] 2026-03-24T10:47:43.667 INFO:teuthology.orchestra.run.vm05.stdout:Get:86 https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 radosgw amd64 20.2.0-712-g70f8415b-1jammy [25.1 MB] 2026-03-24T10:47:45.147 INFO:teuthology.orchestra.run.vm05.stdout:Get:87 
https://1.chacra.ceph.com/r/ceph/tentacle/70f8415b300f041766fa27faf7d5472699e32388/ubuntu/jammy/flavors/default jammy/main amd64 rbd-fuse amd64 20.2.0-712-g70f8415b-1jammy [97.9 kB] 2026-03-24T10:47:45.436 INFO:teuthology.orchestra.run.vm05.stdout:Fetched 281 MB in 16s (17.3 MB/s) 2026-03-24T10:47:45.643 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package liblttng-ust1:amd64. 2026-03-24T10:47:45.677 INFO:teuthology.orchestra.run.vm05.stdout:(Reading database ... 119262 files and directories currently installed.) 2026-03-24T10:47:45.679 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../00-liblttng-ust1_2.13.1-1ubuntu1_amd64.deb ... 2026-03-24T10:47:45.682 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking liblttng-ust1:amd64 (2.13.1-1ubuntu1) ... 2026-03-24T10:47:45.703 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package libdouble-conversion3:amd64. 2026-03-24T10:47:45.710 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../01-libdouble-conversion3_3.1.7-4_amd64.deb ... 2026-03-24T10:47:45.711 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking libdouble-conversion3:amd64 (3.1.7-4) ... 2026-03-24T10:47:45.728 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package libpcre2-16-0:amd64. 2026-03-24T10:47:45.735 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../02-libpcre2-16-0_10.39-3ubuntu0.1_amd64.deb ... 
2026-03-24T10:47:45.735 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking libpcre2-16-0:amd64 (10.39-3ubuntu0.1) ... 2026-03-24T10:47:45.756 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package libqt5core5a:amd64. 2026-03-24T10:47:45.762 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../03-libqt5core5a_5.15.3+dfsg-2ubuntu0.2_amd64.deb ... 2026-03-24T10:47:45.767 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking libqt5core5a:amd64 (5.15.3+dfsg-2ubuntu0.2) ... 2026-03-24T10:47:45.811 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package libqt5dbus5:amd64. 2026-03-24T10:47:45.817 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../04-libqt5dbus5_5.15.3+dfsg-2ubuntu0.2_amd64.deb ... 2026-03-24T10:47:45.818 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking libqt5dbus5:amd64 (5.15.3+dfsg-2ubuntu0.2) ... 2026-03-24T10:47:45.838 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package libqt5network5:amd64. 2026-03-24T10:47:45.844 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../05-libqt5network5_5.15.3+dfsg-2ubuntu0.2_amd64.deb ... 2026-03-24T10:47:45.845 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking libqt5network5:amd64 (5.15.3+dfsg-2ubuntu0.2) ... 2026-03-24T10:47:45.871 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package libthrift-0.16.0:amd64. 2026-03-24T10:47:45.879 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../06-libthrift-0.16.0_0.16.0-2_amd64.deb ... 2026-03-24T10:47:45.880 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking libthrift-0.16.0:amd64 (0.16.0-2) ... 2026-03-24T10:47:45.905 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../07-librbd1_20.2.0-712-g70f8415b-1jammy_amd64.deb ... 2026-03-24T10:47:45.907 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking librbd1 (20.2.0-712-g70f8415b-1jammy) over (17.2.9-0ubuntu0.22.04.2) ... 
2026-03-24T10:47:45.980 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../08-librados2_20.2.0-712-g70f8415b-1jammy_amd64.deb ... 2026-03-24T10:47:45.982 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking librados2 (20.2.0-712-g70f8415b-1jammy) over (17.2.9-0ubuntu0.22.04.2) ... 2026-03-24T10:47:46.051 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package libnbd0. 2026-03-24T10:47:46.058 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../09-libnbd0_1.10.5-1_amd64.deb ... 2026-03-24T10:47:46.058 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking libnbd0 (1.10.5-1) ... 2026-03-24T10:47:46.074 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package libcephfs2. 2026-03-24T10:47:46.080 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../10-libcephfs2_20.2.0-712-g70f8415b-1jammy_amd64.deb ... 2026-03-24T10:47:46.081 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking libcephfs2 (20.2.0-712-g70f8415b-1jammy) ... 2026-03-24T10:47:46.107 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-rados. 2026-03-24T10:47:46.113 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../11-python3-rados_20.2.0-712-g70f8415b-1jammy_amd64.deb ... 2026-03-24T10:47:46.113 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-rados (20.2.0-712-g70f8415b-1jammy) ... 2026-03-24T10:47:46.133 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-ceph-argparse. 2026-03-24T10:47:46.139 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../12-python3-ceph-argparse_20.2.0-712-g70f8415b-1jammy_all.deb ... 2026-03-24T10:47:46.140 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-ceph-argparse (20.2.0-712-g70f8415b-1jammy) ... 2026-03-24T10:47:46.155 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-cephfs. 
2026-03-24T10:47:46.162 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../13-python3-cephfs_20.2.0-712-g70f8415b-1jammy_amd64.deb ... 2026-03-24T10:47:46.162 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-cephfs (20.2.0-712-g70f8415b-1jammy) ... 2026-03-24T10:47:46.181 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-ceph-common. 2026-03-24T10:47:46.189 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../14-python3-ceph-common_20.2.0-712-g70f8415b-1jammy_all.deb ... 2026-03-24T10:47:46.190 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-ceph-common (20.2.0-712-g70f8415b-1jammy) ... 2026-03-24T10:47:46.213 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-wcwidth. 2026-03-24T10:47:46.220 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../15-python3-wcwidth_0.2.5+dfsg1-1_all.deb ... 2026-03-24T10:47:46.221 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-wcwidth (0.2.5+dfsg1-1) ... 2026-03-24T10:47:46.240 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-prettytable. 2026-03-24T10:47:46.247 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../16-python3-prettytable_2.5.0-2_all.deb ... 2026-03-24T10:47:46.248 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-prettytable (2.5.0-2) ... 2026-03-24T10:47:46.264 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-rbd. 2026-03-24T10:47:46.271 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../17-python3-rbd_20.2.0-712-g70f8415b-1jammy_amd64.deb ... 2026-03-24T10:47:46.272 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-rbd (20.2.0-712-g70f8415b-1jammy) ... 2026-03-24T10:47:46.294 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package librdkafka1:amd64. 
2026-03-24T10:47:46.300 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../18-librdkafka1_1.8.0-1build1_amd64.deb ... 2026-03-24T10:47:46.301 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking librdkafka1:amd64 (1.8.0-1build1) ... 2026-03-24T10:47:46.323 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package librgw2. 2026-03-24T10:47:46.329 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../19-librgw2_20.2.0-712-g70f8415b-1jammy_amd64.deb ... 2026-03-24T10:47:46.330 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking librgw2 (20.2.0-712-g70f8415b-1jammy) ... 2026-03-24T10:47:46.493 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-rgw. 2026-03-24T10:47:46.500 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../20-python3-rgw_20.2.0-712-g70f8415b-1jammy_amd64.deb ... 2026-03-24T10:47:46.500 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-rgw (20.2.0-712-g70f8415b-1jammy) ... 2026-03-24T10:47:46.652 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package liboath0:amd64. 2026-03-24T10:47:46.658 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../21-liboath0_2.6.7-3ubuntu0.1_amd64.deb ... 2026-03-24T10:47:46.659 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking liboath0:amd64 (2.6.7-3ubuntu0.1) ... 2026-03-24T10:47:46.676 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package libradosstriper1. 2026-03-24T10:47:46.682 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../22-libradosstriper1_20.2.0-712-g70f8415b-1jammy_amd64.deb ... 2026-03-24T10:47:46.683 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking libradosstriper1 (20.2.0-712-g70f8415b-1jammy) ... 2026-03-24T10:47:46.709 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package ceph-common. 
2026-03-24T10:47:46.718 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../23-ceph-common_20.2.0-712-g70f8415b-1jammy_amd64.deb ... 2026-03-24T10:47:46.719 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking ceph-common (20.2.0-712-g70f8415b-1jammy) ... 2026-03-24T10:47:47.654 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package ceph-base. 2026-03-24T10:47:47.662 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../24-ceph-base_20.2.0-712-g70f8415b-1jammy_amd64.deb ... 2026-03-24T10:47:47.667 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking ceph-base (20.2.0-712-g70f8415b-1jammy) ... 2026-03-24T10:47:47.773 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-jaraco.functools. 2026-03-24T10:47:47.780 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../25-python3-jaraco.functools_3.4.0-2_all.deb ... 2026-03-24T10:47:47.781 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-jaraco.functools (3.4.0-2) ... 2026-03-24T10:47:47.797 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-cheroot. 2026-03-24T10:47:47.805 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../26-python3-cheroot_8.5.2+ds1-1ubuntu3.1_all.deb ... 2026-03-24T10:47:47.806 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-cheroot (8.5.2+ds1-1ubuntu3.1) ... 2026-03-24T10:47:47.837 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-jaraco.classes. 2026-03-24T10:47:47.844 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../27-python3-jaraco.classes_3.2.1-3_all.deb ... 2026-03-24T10:47:47.845 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-jaraco.classes (3.2.1-3) ... 2026-03-24T10:47:47.861 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-jaraco.text. 
2026-03-24T10:47:47.867 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../28-python3-jaraco.text_3.6.0-2_all.deb ...
2026-03-24T10:47:47.868 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-jaraco.text (3.6.0-2) ...
2026-03-24T10:47:47.884 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-jaraco.collections.
2026-03-24T10:47:47.890 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../29-python3-jaraco.collections_3.4.0-2_all.deb ...
2026-03-24T10:47:47.891 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-jaraco.collections (3.4.0-2) ...
2026-03-24T10:47:47.908 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-tempora.
2026-03-24T10:47:47.916 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../30-python3-tempora_4.1.2-1_all.deb ...
2026-03-24T10:47:47.917 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-tempora (4.1.2-1) ...
2026-03-24T10:47:47.935 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-portend.
2026-03-24T10:47:47.943 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../31-python3-portend_3.0.0-1_all.deb ...
2026-03-24T10:47:47.944 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-portend (3.0.0-1) ...
2026-03-24T10:47:47.962 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-zc.lockfile.
2026-03-24T10:47:47.968 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../32-python3-zc.lockfile_2.0-1_all.deb ...
2026-03-24T10:47:47.969 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-zc.lockfile (2.0-1) ...
2026-03-24T10:47:47.986 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-cherrypy3.
2026-03-24T10:47:47.992 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../33-python3-cherrypy3_18.6.1-4_all.deb ...
2026-03-24T10:47:47.993 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-cherrypy3 (18.6.1-4) ...
2026-03-24T10:47:48.022 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-natsort.
2026-03-24T10:47:48.029 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../34-python3-natsort_8.0.2-1_all.deb ...
2026-03-24T10:47:48.030 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-natsort (8.0.2-1) ...
2026-03-24T10:47:48.047 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package ceph-mgr-modules-core.
2026-03-24T10:47:48.054 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../35-ceph-mgr-modules-core_20.2.0-712-g70f8415b-1jammy_all.deb ...
2026-03-24T10:47:48.055 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking ceph-mgr-modules-core (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:48.091 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package libsqlite3-mod-ceph.
2026-03-24T10:47:48.099 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../36-libsqlite3-mod-ceph_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T10:47:48.100 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking libsqlite3-mod-ceph (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:48.118 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package ceph-mgr.
2026-03-24T10:47:48.124 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../37-ceph-mgr_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T10:47:48.125 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking ceph-mgr (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:48.155 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package ceph-mon.
2026-03-24T10:47:48.162 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../38-ceph-mon_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T10:47:48.163 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking ceph-mon (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:48.270 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package libfuse2:amd64.
2026-03-24T10:47:48.278 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../39-libfuse2_2.9.9-5ubuntu3_amd64.deb ...
2026-03-24T10:47:48.279 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking libfuse2:amd64 (2.9.9-5ubuntu3) ...
2026-03-24T10:47:48.300 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package ceph-osd.
2026-03-24T10:47:48.308 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../40-ceph-osd_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T10:47:48.309 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking ceph-osd (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:48.615 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package ceph.
2026-03-24T10:47:48.621 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../41-ceph_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T10:47:48.622 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking ceph (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:48.641 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package ceph-fuse.
2026-03-24T10:47:48.649 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../42-ceph-fuse_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T10:47:48.650 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking ceph-fuse (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:48.681 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package ceph-mds.
2026-03-24T10:47:48.688 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../43-ceph-mds_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T10:47:48.689 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking ceph-mds (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:48.740 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package cephadm.
2026-03-24T10:47:48.746 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../44-cephadm_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T10:47:48.747 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking cephadm (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:48.766 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-asyncssh.
2026-03-24T10:47:48.773 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../45-python3-asyncssh_2.5.0-1ubuntu0.1_all.deb ...
2026-03-24T10:47:48.774 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-asyncssh (2.5.0-1ubuntu0.1) ...
2026-03-24T10:47:48.848 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package ceph-mgr-cephadm.
2026-03-24T10:47:48.855 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../46-ceph-mgr-cephadm_20.2.0-712-g70f8415b-1jammy_all.deb ...
2026-03-24T10:47:48.856 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking ceph-mgr-cephadm (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:48.883 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-repoze.lru.
2026-03-24T10:47:48.890 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../47-python3-repoze.lru_0.7-2_all.deb ...
2026-03-24T10:47:48.891 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-repoze.lru (0.7-2) ...
2026-03-24T10:47:48.909 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-routes.
2026-03-24T10:47:48.915 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../48-python3-routes_2.5.1-1ubuntu1_all.deb ...
2026-03-24T10:47:48.916 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-routes (2.5.1-1ubuntu1) ...
2026-03-24T10:47:48.942 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package ceph-mgr-dashboard.
2026-03-24T10:47:48.948 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../49-ceph-mgr-dashboard_20.2.0-712-g70f8415b-1jammy_all.deb ...
2026-03-24T10:47:48.949 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking ceph-mgr-dashboard (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:49.691 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-sklearn-lib:amd64.
2026-03-24T10:47:49.698 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../50-python3-sklearn-lib_0.23.2-5ubuntu6_amd64.deb ...
2026-03-24T10:47:49.699 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-sklearn-lib:amd64 (0.23.2-5ubuntu6) ...
2026-03-24T10:47:49.764 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-joblib.
2026-03-24T10:47:49.771 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../51-python3-joblib_0.17.0-4ubuntu1_all.deb ...
2026-03-24T10:47:49.772 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-joblib (0.17.0-4ubuntu1) ...
2026-03-24T10:47:49.806 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-threadpoolctl.
2026-03-24T10:47:49.812 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../52-python3-threadpoolctl_3.1.0-1_all.deb ...
2026-03-24T10:47:49.813 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-threadpoolctl (3.1.0-1) ...
2026-03-24T10:47:49.829 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-sklearn.
2026-03-24T10:47:49.836 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../53-python3-sklearn_0.23.2-5ubuntu6_all.deb ...
2026-03-24T10:47:49.837 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-sklearn (0.23.2-5ubuntu6) ...
2026-03-24T10:47:49.961 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package ceph-mgr-diskprediction-local.
2026-03-24T10:47:49.967 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../54-ceph-mgr-diskprediction-local_20.2.0-712-g70f8415b-1jammy_all.deb ...
2026-03-24T10:47:49.968 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking ceph-mgr-diskprediction-local (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:50.264 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-cachetools.
2026-03-24T10:47:50.270 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../55-python3-cachetools_5.0.0-1_all.deb ...
2026-03-24T10:47:50.271 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-cachetools (5.0.0-1) ...
2026-03-24T10:47:50.288 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-rsa.
2026-03-24T10:47:50.295 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../56-python3-rsa_4.8-1_all.deb ...
2026-03-24T10:47:50.296 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-rsa (4.8-1) ...
2026-03-24T10:47:50.316 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-google-auth.
2026-03-24T10:47:50.323 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../57-python3-google-auth_1.5.1-3_all.deb ...
2026-03-24T10:47:50.324 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-google-auth (1.5.1-3) ...
2026-03-24T10:47:50.345 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-requests-oauthlib.
2026-03-24T10:47:50.352 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../58-python3-requests-oauthlib_1.3.0+ds-0.1_all.deb ...
2026-03-24T10:47:50.353 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-requests-oauthlib (1.3.0+ds-0.1) ...
2026-03-24T10:47:50.371 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-websocket.
2026-03-24T10:47:50.377 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../59-python3-websocket_1.2.3-1_all.deb ...
2026-03-24T10:47:50.378 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-websocket (1.2.3-1) ...
2026-03-24T10:47:50.399 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-kubernetes.
2026-03-24T10:47:50.406 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../60-python3-kubernetes_12.0.1-1ubuntu1_all.deb ...
2026-03-24T10:47:50.407 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-kubernetes (12.0.1-1ubuntu1) ...
2026-03-24T10:47:50.550 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package ceph-mgr-k8sevents.
2026-03-24T10:47:50.557 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../61-ceph-mgr-k8sevents_20.2.0-712-g70f8415b-1jammy_all.deb ...
2026-03-24T10:47:50.558 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking ceph-mgr-k8sevents (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:50.574 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package libonig5:amd64.
2026-03-24T10:47:50.580 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../62-libonig5_6.9.7.1-2build1_amd64.deb ...
2026-03-24T10:47:50.581 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking libonig5:amd64 (6.9.7.1-2build1) ...
2026-03-24T10:47:50.599 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package libjq1:amd64.
2026-03-24T10:47:50.605 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../63-libjq1_1.6-2.1ubuntu3.1_amd64.deb ...
2026-03-24T10:47:50.606 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking libjq1:amd64 (1.6-2.1ubuntu3.1) ...
2026-03-24T10:47:50.621 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package jq.
2026-03-24T10:47:50.627 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../64-jq_1.6-2.1ubuntu3.1_amd64.deb ...
2026-03-24T10:47:50.628 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking jq (1.6-2.1ubuntu3.1) ...
2026-03-24T10:47:50.643 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package socat.
2026-03-24T10:47:50.649 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../65-socat_1.7.4.1-3ubuntu4_amd64.deb ...
2026-03-24T10:47:50.650 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking socat (1.7.4.1-3ubuntu4) ...
2026-03-24T10:47:50.673 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package xmlstarlet.
2026-03-24T10:47:50.679 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../66-xmlstarlet_1.6.1-2.1_amd64.deb ...
2026-03-24T10:47:50.680 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking xmlstarlet (1.6.1-2.1) ...
2026-03-24T10:47:50.721 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package ceph-test.
2026-03-24T10:47:50.727 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../67-ceph-test_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T10:47:50.728 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking ceph-test (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:52.308 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package ceph-volume.
2026-03-24T10:47:52.314 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../68-ceph-volume_20.2.0-712-g70f8415b-1jammy_all.deb ...
2026-03-24T10:47:52.315 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking ceph-volume (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:52.342 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package libcephfs-daemon.
2026-03-24T10:47:52.348 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../69-libcephfs-daemon_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T10:47:52.349 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking libcephfs-daemon (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:52.364 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package libcephfs-proxy2.
2026-03-24T10:47:52.372 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../70-libcephfs-proxy2_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T10:47:52.373 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking libcephfs-proxy2 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:52.391 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package libcephfs-dev.
2026-03-24T10:47:52.396 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../71-libcephfs-dev_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T10:47:52.397 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking libcephfs-dev (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:52.414 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package nvme-cli.
2026-03-24T10:47:52.421 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../72-nvme-cli_1.16-3ubuntu0.3_amd64.deb ...
2026-03-24T10:47:52.422 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking nvme-cli (1.16-3ubuntu0.3) ...
2026-03-24T10:47:52.462 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python-asyncssh-doc.
2026-03-24T10:47:52.469 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../73-python-asyncssh-doc_2.5.0-1ubuntu0.1_all.deb ...
2026-03-24T10:47:52.470 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python-asyncssh-doc (2.5.0-1ubuntu0.1) ...
2026-03-24T10:47:52.514 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-iniconfig.
2026-03-24T10:47:52.520 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../74-python3-iniconfig_1.1.1-2_all.deb ...
2026-03-24T10:47:52.521 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-iniconfig (1.1.1-2) ...
2026-03-24T10:47:52.538 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-pluggy.
2026-03-24T10:47:52.544 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../75-python3-pluggy_0.13.0-7.1_all.deb ...
2026-03-24T10:47:52.545 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-pluggy (0.13.0-7.1) ...
2026-03-24T10:47:52.564 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-psutil.
2026-03-24T10:47:52.570 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../76-python3-psutil_5.9.0-1build1_amd64.deb ...
2026-03-24T10:47:52.571 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-psutil (5.9.0-1build1) ...
2026-03-24T10:47:52.595 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-py.
2026-03-24T10:47:52.599 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../77-python3-py_1.10.0-1_all.deb ...
2026-03-24T10:47:52.600 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-py (1.10.0-1) ...
2026-03-24T10:47:52.624 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-pygments.
2026-03-24T10:47:52.630 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../78-python3-pygments_2.11.2+dfsg-2ubuntu0.1_all.deb ...
2026-03-24T10:47:52.631 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-pygments (2.11.2+dfsg-2ubuntu0.1) ...
2026-03-24T10:47:52.689 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-toml.
2026-03-24T10:47:52.696 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../79-python3-toml_0.10.2-1_all.deb ...
2026-03-24T10:47:52.697 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-toml (0.10.2-1) ...
2026-03-24T10:47:52.715 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-pytest.
2026-03-24T10:47:52.721 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../80-python3-pytest_6.2.5-1ubuntu2_all.deb ...
2026-03-24T10:47:52.722 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-pytest (6.2.5-1ubuntu2) ...
2026-03-24T10:47:52.763 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-simplejson.
2026-03-24T10:47:52.768 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../81-python3-simplejson_3.17.6-1build1_amd64.deb ...
2026-03-24T10:47:52.769 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-simplejson (3.17.6-1build1) ...
2026-03-24T10:47:52.791 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-webob.
2026-03-24T10:47:52.797 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../82-python3-webob_1%3a1.8.6-1.1ubuntu0.1_all.deb ...
2026-03-24T10:47:52.798 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-webob (1:1.8.6-1.1ubuntu0.1) ...
2026-03-24T10:47:52.817 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package qttranslations5-l10n.
2026-03-24T10:47:52.824 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../83-qttranslations5-l10n_5.15.3-1_all.deb ...
2026-03-24T10:47:52.825 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking qttranslations5-l10n (5.15.3-1) ...
2026-03-24T10:47:52.936 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package radosgw.
2026-03-24T10:47:52.942 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../84-radosgw_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T10:47:52.943 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking radosgw (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:53.330 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package rbd-fuse.
2026-03-24T10:47:53.336 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../85-rbd-fuse_20.2.0-712-g70f8415b-1jammy_amd64.deb ...
2026-03-24T10:47:53.338 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking rbd-fuse (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:53.356 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package smartmontools.
2026-03-24T10:47:53.363 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../86-smartmontools_7.2-1ubuntu0.1_amd64.deb ...
2026-03-24T10:47:53.371 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking smartmontools (7.2-1ubuntu0.1) ...
2026-03-24T10:47:53.416 INFO:teuthology.orchestra.run.vm05.stdout:Setting up smartmontools (7.2-1ubuntu0.1) ...
2026-03-24T10:47:53.650 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/smartd.service → /lib/systemd/system/smartmontools.service.
2026-03-24T10:47:53.650 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/smartmontools.service → /lib/systemd/system/smartmontools.service.
2026-03-24T10:47:54.018 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-iniconfig (1.1.1-2) ...
2026-03-24T10:47:54.083 INFO:teuthology.orchestra.run.vm05.stdout:Setting up libdouble-conversion3:amd64 (3.1.7-4) ...
2026-03-24T10:47:54.085 INFO:teuthology.orchestra.run.vm05.stdout:Setting up nvme-cli (1.16-3ubuntu0.3) ...
2026-03-24T10:47:54.150 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/default.target.wants/nvmefc-boot-connections.service → /lib/systemd/system/nvmefc-boot-connections.service.
2026-03-24T10:47:54.383 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/default.target.wants/nvmf-autoconnect.service → /lib/systemd/system/nvmf-autoconnect.service.
2026-03-24T10:47:54.761 INFO:teuthology.orchestra.run.vm05.stdout:nvmf-connect.target is a disabled or a static unit, not starting it.
2026-03-24T10:47:54.782 INFO:teuthology.orchestra.run.vm05.stdout:Setting up cephadm (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:54.826 INFO:teuthology.orchestra.run.vm05.stdout:Adding system user cephadm....done
2026-03-24T10:47:54.835 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-jaraco.classes (3.2.1-3) ...
2026-03-24T10:47:54.900 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python-asyncssh-doc (2.5.0-1ubuntu0.1) ...
2026-03-24T10:47:54.902 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-jaraco.functools (3.4.0-2) ...
2026-03-24T10:47:54.968 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-repoze.lru (0.7-2) ...
2026-03-24T10:47:55.044 INFO:teuthology.orchestra.run.vm05.stdout:Setting up liboath0:amd64 (2.6.7-3ubuntu0.1) ...
2026-03-24T10:47:55.047 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-py (1.10.0-1) ...
2026-03-24T10:47:55.139 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-joblib (0.17.0-4ubuntu1) ...
2026-03-24T10:47:55.261 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-cachetools (5.0.0-1) ...
2026-03-24T10:47:55.327 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-threadpoolctl (3.1.0-1) ...
2026-03-24T10:47:55.392 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-ceph-argparse (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:55.462 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-sklearn-lib:amd64 (0.23.2-5ubuntu6) ...
2026-03-24T10:47:55.464 INFO:teuthology.orchestra.run.vm05.stdout:Setting up libnbd0 (1.10.5-1) ...
2026-03-24T10:47:55.466 INFO:teuthology.orchestra.run.vm05.stdout:Setting up libfuse2:amd64 (2.9.9-5ubuntu3) ...
2026-03-24T10:47:55.468 INFO:teuthology.orchestra.run.vm05.stdout:Setting up libpcre2-16-0:amd64 (10.39-3ubuntu0.1) ...
2026-03-24T10:47:55.470 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-psutil (5.9.0-1build1) ...
2026-03-24T10:47:55.595 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-natsort (8.0.2-1) ...
2026-03-24T10:47:55.667 INFO:teuthology.orchestra.run.vm05.stdout:Setting up libcephfs-proxy2 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:55.669 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-routes (2.5.1-1ubuntu1) ...
2026-03-24T10:47:55.743 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-simplejson (3.17.6-1build1) ...
2026-03-24T10:47:55.826 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-pygments (2.11.2+dfsg-2ubuntu0.1) ...
2026-03-24T10:47:56.098 INFO:teuthology.orchestra.run.vm05.stdout:Setting up qttranslations5-l10n (5.15.3-1) ...
2026-03-24T10:47:56.102 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-wcwidth (0.2.5+dfsg1-1) ...
2026-03-24T10:47:56.191 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-asyncssh (2.5.0-1ubuntu0.1) ...
2026-03-24T10:47:56.333 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-cheroot (8.5.2+ds1-1ubuntu3.1) ...
2026-03-24T10:47:56.420 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-jaraco.text (3.6.0-2) ...
2026-03-24T10:47:56.489 INFO:teuthology.orchestra.run.vm05.stdout:Setting up socat (1.7.4.1-3ubuntu4) ...
2026-03-24T10:47:56.491 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-ceph-common (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:56.587 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-sklearn (0.23.2-5ubuntu6) ...
2026-03-24T10:47:57.129 INFO:teuthology.orchestra.run.vm05.stdout:Setting up libqt5core5a:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-24T10:47:57.134 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-toml (0.10.2-1) ...
2026-03-24T10:47:57.205 INFO:teuthology.orchestra.run.vm05.stdout:Setting up librdkafka1:amd64 (1.8.0-1build1) ...
2026-03-24T10:47:57.207 INFO:teuthology.orchestra.run.vm05.stdout:Setting up xmlstarlet (1.6.1-2.1) ...
2026-03-24T10:47:57.210 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-pluggy (0.13.0-7.1) ...
2026-03-24T10:47:57.279 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-zc.lockfile (2.0-1) ...
2026-03-24T10:47:57.345 INFO:teuthology.orchestra.run.vm05.stdout:Setting up libqt5dbus5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-24T10:47:57.347 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-rsa (4.8-1) ...
2026-03-24T10:47:57.418 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-tempora (4.1.2-1) ...
2026-03-24T10:47:57.487 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-prettytable (2.5.0-2) ...
2026-03-24T10:47:57.561 INFO:teuthology.orchestra.run.vm05.stdout:Setting up liblttng-ust1:amd64 (2.13.1-1ubuntu1) ...
2026-03-24T10:47:57.564 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-websocket (1.2.3-1) ...
2026-03-24T10:47:57.642 INFO:teuthology.orchestra.run.vm05.stdout:Setting up libonig5:amd64 (6.9.7.1-2build1) ...
2026-03-24T10:47:57.644 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-requests-oauthlib (1.3.0+ds-0.1) ...
2026-03-24T10:47:57.718 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-webob (1:1.8.6-1.1ubuntu0.1) ...
2026-03-24T10:47:57.809 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-jaraco.collections (3.4.0-2) ...
2026-03-24T10:47:57.876 INFO:teuthology.orchestra.run.vm05.stdout:Setting up libjq1:amd64 (1.6-2.1ubuntu3.1) ...
2026-03-24T10:47:57.878 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-pytest (6.2.5-1ubuntu2) ...
2026-03-24T10:47:58.008 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-portend (3.0.0-1) ...
2026-03-24T10:47:58.076 INFO:teuthology.orchestra.run.vm05.stdout:Setting up libqt5network5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-24T10:47:58.079 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-google-auth (1.5.1-3) ...
2026-03-24T10:47:58.159 INFO:teuthology.orchestra.run.vm05.stdout:Setting up jq (1.6-2.1ubuntu3.1) ...
2026-03-24T10:47:58.161 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-cherrypy3 (18.6.1-4) ...
2026-03-24T10:47:58.290 INFO:teuthology.orchestra.run.vm05.stdout:Setting up libthrift-0.16.0:amd64 (0.16.0-2) ...
2026-03-24T10:47:58.292 INFO:teuthology.orchestra.run.vm05.stdout:Setting up librados2 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:58.294 INFO:teuthology.orchestra.run.vm05.stdout:Setting up librgw2 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:58.297 INFO:teuthology.orchestra.run.vm05.stdout:Setting up libsqlite3-mod-ceph (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:58.299 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-kubernetes (12.0.1-1ubuntu1) ...
2026-03-24T10:47:58.839 INFO:teuthology.orchestra.run.vm05.stdout:Setting up libcephfs2 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:58.841 INFO:teuthology.orchestra.run.vm05.stdout:Setting up libradosstriper1 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:58.843 INFO:teuthology.orchestra.run.vm05.stdout:Setting up librbd1 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:58.845 INFO:teuthology.orchestra.run.vm05.stdout:Setting up ceph-mgr-modules-core (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:58.847 INFO:teuthology.orchestra.run.vm05.stdout:Setting up ceph-fuse (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:58.910 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/remote-fs.target.wants/ceph-fuse.target → /lib/systemd/system/ceph-fuse.target.
2026-03-24T10:47:58.911 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-fuse.target → /lib/systemd/system/ceph-fuse.target.
2026-03-24T10:47:59.466 INFO:teuthology.orchestra.run.vm05.stdout:Setting up libcephfs-dev (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:59.469 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-rados (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:59.471 INFO:teuthology.orchestra.run.vm05.stdout:Setting up libcephfs-daemon (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:59.474 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-rbd (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:59.476 INFO:teuthology.orchestra.run.vm05.stdout:Setting up rbd-fuse (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:59.478 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-rgw (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:59.480 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-cephfs (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:59.483 INFO:teuthology.orchestra.run.vm05.stdout:Setting up ceph-common (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:47:59.516 INFO:teuthology.orchestra.run.vm05.stdout:Adding group ceph....done
2026-03-24T10:47:59.550 INFO:teuthology.orchestra.run.vm05.stdout:Adding system user ceph....done
2026-03-24T10:47:59.558 INFO:teuthology.orchestra.run.vm05.stdout:Setting system user ceph properties....done
2026-03-24T10:47:59.562 INFO:teuthology.orchestra.run.vm05.stdout:chown: cannot access '/var/log/ceph/*.log*': No such file or directory
2026-03-24T10:47:59.628 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /lib/systemd/system/ceph.target.
2026-03-24T10:47:59.866 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/rbdmap.service → /lib/systemd/system/rbdmap.service.
2026-03-24T10:48:00.232 INFO:teuthology.orchestra.run.vm05.stdout:Setting up ceph-test (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:48:00.234 INFO:teuthology.orchestra.run.vm05.stdout:Setting up radosgw (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:48:00.466 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-radosgw.target → /lib/systemd/system/ceph-radosgw.target.
2026-03-24T10:48:00.466 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-radosgw.target → /lib/systemd/system/ceph-radosgw.target.
2026-03-24T10:48:00.838 INFO:teuthology.orchestra.run.vm05.stdout:Setting up ceph-base (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:48:00.926 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-crash.service → /lib/systemd/system/ceph-crash.service.
2026-03-24T10:48:01.297 INFO:teuthology.orchestra.run.vm05.stdout:Setting up ceph-mds (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:48:01.362 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mds.target → /lib/systemd/system/ceph-mds.target.
2026-03-24T10:48:01.363 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mds.target → /lib/systemd/system/ceph-mds.target.
2026-03-24T10:48:01.733 INFO:teuthology.orchestra.run.vm05.stdout:Setting up ceph-mgr (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:48:01.804 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mgr.target → /lib/systemd/system/ceph-mgr.target.
2026-03-24T10:48:01.804 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mgr.target → /lib/systemd/system/ceph-mgr.target.
2026-03-24T10:48:02.193 INFO:teuthology.orchestra.run.vm05.stdout:Setting up ceph-osd (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:48:02.270 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-osd.target → /lib/systemd/system/ceph-osd.target.
2026-03-24T10:48:02.271 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-osd.target → /lib/systemd/system/ceph-osd.target.
2026-03-24T10:48:02.639 INFO:teuthology.orchestra.run.vm05.stdout:Setting up ceph-mgr-k8sevents (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:48:02.641 INFO:teuthology.orchestra.run.vm05.stdout:Setting up ceph-mgr-diskprediction-local (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:48:02.654 INFO:teuthology.orchestra.run.vm05.stdout:Setting up ceph-mon (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:48:02.714 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mon.target → /lib/systemd/system/ceph-mon.target.
2026-03-24T10:48:02.714 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mon.target → /lib/systemd/system/ceph-mon.target.
2026-03-24T10:48:03.082 INFO:teuthology.orchestra.run.vm05.stdout:Setting up ceph-mgr-cephadm (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:48:03.097 INFO:teuthology.orchestra.run.vm05.stdout:Setting up ceph (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:48:03.099 INFO:teuthology.orchestra.run.vm05.stdout:Setting up ceph-mgr-dashboard (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:48:03.112 INFO:teuthology.orchestra.run.vm05.stdout:Setting up ceph-volume (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T10:48:03.228 INFO:teuthology.orchestra.run.vm05.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-24T10:48:03.303 INFO:teuthology.orchestra.run.vm05.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ...
2026-03-24T10:48:03.594 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:03.594 INFO:teuthology.orchestra.run.vm05.stdout:Running kernel seems to be up-to-date.
2026-03-24T10:48:03.594 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:03.594 INFO:teuthology.orchestra.run.vm05.stdout:Services to be restarted:
2026-03-24T10:48:03.596 INFO:teuthology.orchestra.run.vm05.stdout: systemctl restart apache-htcacheclean.service
2026-03-24T10:48:03.601 INFO:teuthology.orchestra.run.vm05.stdout: systemctl restart rsyslog.service
2026-03-24T10:48:03.604 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:03.604 INFO:teuthology.orchestra.run.vm05.stdout:Service restarts being deferred:
2026-03-24T10:48:03.604 INFO:teuthology.orchestra.run.vm05.stdout: systemctl restart networkd-dispatcher.service
2026-03-24T10:48:03.604 INFO:teuthology.orchestra.run.vm05.stdout: systemctl restart unattended-upgrades.service
2026-03-24T10:48:03.604 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:03.604 INFO:teuthology.orchestra.run.vm05.stdout:No containers need to be restarted.
2026-03-24T10:48:03.604 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:03.604 INFO:teuthology.orchestra.run.vm05.stdout:No user sessions are running outdated binaries.
2026-03-24T10:48:03.604 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:03.604 INFO:teuthology.orchestra.run.vm05.stdout:No VM guests are running outdated hypervisor (qemu) binaries on this host.
2026-03-24T10:48:04.476 INFO:teuthology.orchestra.run.vm05.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T10:48:04.479 DEBUG:teuthology.orchestra.run.vm05:> sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" install python3-jmespath python3-xmltodict s3cmd
2026-03-24T10:48:04.558 INFO:teuthology.orchestra.run.vm05.stdout:Reading package lists...
2026-03-24T10:48:04.736 INFO:teuthology.orchestra.run.vm05.stdout:Building dependency tree...
2026-03-24T10:48:04.736 INFO:teuthology.orchestra.run.vm05.stdout:Reading state information...
2026-03-24T10:48:04.879 INFO:teuthology.orchestra.run.vm05.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T10:48:04.879 INFO:teuthology.orchestra.run.vm05.stdout: kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-24T10:48:04.879 INFO:teuthology.orchestra.run.vm05.stdout: libsgutils2-2 sg3-utils sg3-utils-udev
2026-03-24T10:48:04.880 INFO:teuthology.orchestra.run.vm05.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T10:48:04.897 INFO:teuthology.orchestra.run.vm05.stdout:The following NEW packages will be installed:
2026-03-24T10:48:04.897 INFO:teuthology.orchestra.run.vm05.stdout: python3-jmespath python3-xmltodict s3cmd
2026-03-24T10:48:04.980 INFO:teuthology.orchestra.run.vm05.stdout:0 upgraded, 3 newly installed, 0 to remove and 44 not upgraded.
2026-03-24T10:48:04.980 INFO:teuthology.orchestra.run.vm05.stdout:Need to get 155 kB of archives.
2026-03-24T10:48:04.980 INFO:teuthology.orchestra.run.vm05.stdout:After this operation, 678 kB of additional disk space will be used.
2026-03-24T10:48:04.980 INFO:teuthology.orchestra.run.vm05.stdout:Get:1 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jmespath all 0.10.0-1 [21.7 kB]
2026-03-24T10:48:05.056 INFO:teuthology.orchestra.run.vm05.stdout:Get:2 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-xmltodict all 0.12.0-2 [12.6 kB]
2026-03-24T10:48:05.064 INFO:teuthology.orchestra.run.vm05.stdout:Get:3 http://archive.ubuntu.com/ubuntu jammy/universe amd64 s3cmd all 2.2.0-1 [120 kB]
2026-03-24T10:48:05.340 INFO:teuthology.orchestra.run.vm05.stdout:Fetched 155 kB in 0s (642 kB/s)
2026-03-24T10:48:05.355 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-jmespath.
2026-03-24T10:48:05.386 INFO:teuthology.orchestra.run.vm05.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 126082 files and directories currently installed.)
2026-03-24T10:48:05.388 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../python3-jmespath_0.10.0-1_all.deb ...
2026-03-24T10:48:05.390 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-jmespath (0.10.0-1) ...
2026-03-24T10:48:05.406 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package python3-xmltodict.
2026-03-24T10:48:05.412 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../python3-xmltodict_0.12.0-2_all.deb ...
2026-03-24T10:48:05.413 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking python3-xmltodict (0.12.0-2) ...
2026-03-24T10:48:05.428 INFO:teuthology.orchestra.run.vm05.stdout:Selecting previously unselected package s3cmd.
2026-03-24T10:48:05.434 INFO:teuthology.orchestra.run.vm05.stdout:Preparing to unpack .../archives/s3cmd_2.2.0-1_all.deb ...
2026-03-24T10:48:05.435 INFO:teuthology.orchestra.run.vm05.stdout:Unpacking s3cmd (2.2.0-1) ...
2026-03-24T10:48:05.469 INFO:teuthology.orchestra.run.vm05.stdout:Setting up s3cmd (2.2.0-1) ...
2026-03-24T10:48:05.563 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-xmltodict (0.12.0-2) ...
2026-03-24T10:48:05.631 INFO:teuthology.orchestra.run.vm05.stdout:Setting up python3-jmespath (0.10.0-1) ...
2026-03-24T10:48:05.705 INFO:teuthology.orchestra.run.vm05.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-24T10:48:06.024 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:06.024 INFO:teuthology.orchestra.run.vm05.stdout:Running kernel seems to be up-to-date.
2026-03-24T10:48:06.024 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:06.024 INFO:teuthology.orchestra.run.vm05.stdout:Services to be restarted:
2026-03-24T10:48:06.027 INFO:teuthology.orchestra.run.vm05.stdout: systemctl restart apache-htcacheclean.service
2026-03-24T10:48:06.032 INFO:teuthology.orchestra.run.vm05.stdout: systemctl restart rsyslog.service
2026-03-24T10:48:06.036 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:06.036 INFO:teuthology.orchestra.run.vm05.stdout:Service restarts being deferred:
2026-03-24T10:48:06.036 INFO:teuthology.orchestra.run.vm05.stdout: systemctl restart networkd-dispatcher.service
2026-03-24T10:48:06.036 INFO:teuthology.orchestra.run.vm05.stdout: systemctl restart unattended-upgrades.service
2026-03-24T10:48:06.036 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:06.036 INFO:teuthology.orchestra.run.vm05.stdout:No containers need to be restarted.
2026-03-24T10:48:06.036 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:06.036 INFO:teuthology.orchestra.run.vm05.stdout:No user sessions are running outdated binaries.
2026-03-24T10:48:06.036 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:06.036 INFO:teuthology.orchestra.run.vm05.stdout:No VM guests are running outdated hypervisor (qemu) binaries on this host.
2026-03-24T10:48:06.814 INFO:teuthology.orchestra.run.vm05.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T10:48:06.818 DEBUG:teuthology.parallel:result is None
2026-03-24T10:48:06.818 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=ubuntu%2F22.04%2Fx86_64&sha1=70f8415b300f041766fa27faf7d5472699e32388
2026-03-24T10:48:07.480 DEBUG:teuthology.orchestra.run.vm05:> dpkg-query -W -f '${Version}' ceph
2026-03-24T10:48:07.489 INFO:teuthology.orchestra.run.vm05.stdout:20.2.0-712-g70f8415b-1jammy
2026-03-24T10:48:07.489 INFO:teuthology.packaging:The installed version of ceph is 20.2.0-712-g70f8415b-1jammy
2026-03-24T10:48:07.489 INFO:teuthology.task.install:The correct ceph version 20.2.0-712-g70f8415b-1jammy is installed.
2026-03-24T10:48:07.490 INFO:teuthology.task.install.util:Shipping valgrind.supp...
2026-03-24T10:48:07.490 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-24T10:48:07.490 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/home/ubuntu/cephtest/valgrind.supp
2026-03-24T10:48:07.543 INFO:teuthology.task.install.util:Shipping 'daemon-helper'...
2026-03-24T10:48:07.543 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-24T10:48:07.543 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/usr/bin/daemon-helper
2026-03-24T10:48:07.591 DEBUG:teuthology.orchestra.run.vm05:> sudo chmod a=rx -- /usr/bin/daemon-helper
2026-03-24T10:48:07.643 INFO:teuthology.task.install.util:Shipping 'adjust-ulimits'...
2026-03-24T10:48:07.643 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-24T10:48:07.643 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/usr/bin/adjust-ulimits
2026-03-24T10:48:07.695 DEBUG:teuthology.orchestra.run.vm05:> sudo chmod a=rx -- /usr/bin/adjust-ulimits
2026-03-24T10:48:07.743 INFO:teuthology.task.install.util:Shipping 'stdin-killer'...
2026-03-24T10:48:07.743 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-24T10:48:07.743 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/usr/bin/stdin-killer
2026-03-24T10:48:07.795 DEBUG:teuthology.orchestra.run.vm05:> sudo chmod a=rx -- /usr/bin/stdin-killer
2026-03-24T10:48:07.846 INFO:teuthology.run_tasks:Running task ceph...
2026-03-24T10:48:07.892 INFO:tasks.ceph:Making ceph log dir writeable by non-root...
2026-03-24T10:48:07.892 DEBUG:teuthology.orchestra.run.vm05:> sudo chmod 777 /var/log/ceph
2026-03-24T10:48:07.900 INFO:tasks.ceph:Disabling ceph logrotate...
2026-03-24T10:48:07.900 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -f -- /etc/logrotate.d/ceph
2026-03-24T10:48:07.950 INFO:tasks.ceph:Creating extra log directories...
2026-03-24T10:48:07.950 DEBUG:teuthology.orchestra.run.vm05:> sudo install -d -m0777 -- /var/log/ceph/valgrind /var/log/ceph/profiling-logger
2026-03-24T10:48:08.001 INFO:tasks.ceph:Creating ceph cluster ceph...
2026-03-24T10:48:08.002 INFO:tasks.ceph:config {'conf': {'client': {'rbd default format': 1}, 'global': {'mon client directed command retry': 5, 'mon warn on pool no app': False, 'ms inject socket failures': 5000}, 'mgr': {'debug mgr': 20, 'debug ms': 1}, 'mon': {'debug mon': 20, 'debug ms': 1, 'debug paxos': 20}, 'osd': {'bluestore block size': 96636764160, 'bluestore compression algorithm': 'lz4', 'bluestore compression mode': 'aggressive', 'bluestore fsck on mount': True, 'debug bluefs': '1/20', 'debug bluestore': '1/20', 'debug ms': 1, 'debug osd': 20, 'debug rocksdb': '4/10', 'mon osd backfillfull_ratio': 0.85, 'mon osd full ratio': 0.9, 'mon osd nearfull ratio': 0.8, 'osd failsafe full ratio': 0.95, 'osd mclock iops capacity threshold hdd': 49000, 'osd objectstore': 'bluestore', 'osd shutdown pgref assert': True}}, 'fs': 'xfs', 'mkfs_options': None, 'mount_options': None, 'skip_mgr_daemons': False, 'log_ignorelist': ['\\(MDS_ALL_DOWN\\)', '\\(MDS_UP_LESS_THAN_MAX\\)', '\\(OSD_SLOW_PING_TIME'], 'cpu_profile': set(), 'cluster': 'ceph', 'mon_bind_msgr2': True, 'mon_bind_addrvec': True}
2026-03-24T10:48:08.002 INFO:tasks.ceph:ctx.config {'archive_path': '/archive/kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps/3589', 'branch': 'tentacle', 'description': 'rbd/cli_v1/{base/install clusters/{fixed-1} conf/{disable-pool-app} features/format-1 msgr-failures/few objectstore/bluestore-comp-lz4 supported-random-distro$/{ubuntu_latest} workloads/rbd_cli_generic}', 'email': None, 'first_in_suite': False, 'flavor': 'default', 'job_id': '3589', 'ktype': 'distro', 'last_in_suite': False, 'machine_type': 'vps', 'name': 'kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps', 'no_nested_subset': False, 'os_type': 'ubuntu', 'os_version': '22.04', 'overrides': {'admin_socket': {'branch': 'tentacle'}, 'ansible.cephlab': {'branch': 'main', 'repo': 'https://github.com/kshtsk/ceph-cm-ansible.git', 'skip_tags': 'nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs', 'vars': {'logical_volumes': {'lv_1': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_2': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_3': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_4': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}}, 'timezone': 'UTC', 'volume_groups': {'vg_nvme': {'pvs': '/dev/vdb,/dev/vdc,/dev/vdd,/dev/vde'}}}}, 'ceph': {'conf': {'client': {'rbd default format': 1}, 'global': {'mon client directed command retry': 5, 'mon warn on pool no app': False, 'ms inject socket failures': 5000}, 'mgr': {'debug mgr': 20, 'debug ms': 1}, 'mon': {'debug mon': 20, 'debug ms': 1, 'debug paxos': 20}, 'osd': {'bluestore block size': 96636764160, 'bluestore compression algorithm': 'lz4', 'bluestore compression mode': 'aggressive', 'bluestore fsck on mount': True, 'debug bluefs': '1/20', 'debug bluestore': '1/20', 'debug ms': 1, 'debug osd': 20, 'debug rocksdb': '4/10', 'mon osd backfillfull_ratio': 0.85, 'mon osd full ratio': 0.9, 'mon osd nearfull ratio': 0.8, 'osd failsafe full ratio': 0.95, 'osd mclock iops capacity threshold hdd': 49000, 'osd objectstore': 'bluestore', 'osd shutdown pgref assert': True}}, 'flavor': 'default', 'fs': 'xfs', 'log-ignorelist': ['\\(MDS_ALL_DOWN\\)', '\\(MDS_UP_LESS_THAN_MAX\\)', '\\(OSD_SLOW_PING_TIME'], 'sha1': '70f8415b300f041766fa27faf7d5472699e32388'}, 'ceph-deploy': {'conf': {'client': {'log file': '/var/log/ceph/ceph-$name.$pid.log'}, 'global': {'osd crush chooseleaf type': 0, 'osd pool default pg num': 128, 'osd pool default pgp num': 128, 'osd pool default size': 2}, 'mon': {}}}, 'cephadm': {'cephadm_binary_url': 'https://download.ceph.com/rpm-20.2.0/el9/noarch/cephadm'}, 'install': {'ceph': {'flavor': 'default', 'sha1': '70f8415b300f041766fa27faf7d5472699e32388'}, 'extra_system_packages': {'deb': ['python3-jmespath', 'python3-xmltodict', 's3cmd'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-jmespath', 'python3-xmltodict', 's3cmd']}}, 'thrashosds': {'bdev_inject_crash': 2, 'bdev_inject_crash_probability': 0.5}, 'workunit': {'branch': 'tt-tentacle', 'sha1': '0392f78529848ec72469e8e431875cb98d3a5fb4'}}, 'owner': 'kyr', 'priority': 1000, 'repo': 'https://github.com/ceph/ceph.git', 'roles': [['mon.a', 'mgr.x', 'osd.0', 'osd.1', 'osd.2', 'client.0']], 'seed': 3051, 'sha1': '70f8415b300f041766fa27faf7d5472699e32388', 'sleep_before_teardown': 0, 'subset': '1/128', 'suite': 'rbd', 'suite_branch': 'tt-tentacle', 'suite_path': '/home/teuthos/src/github.com_kshtsk_ceph_0392f78529848ec72469e8e431875cb98d3a5fb4/qa', 'suite_relpath': 'qa', 'suite_repo': 'https://github.com/kshtsk/ceph.git', 'suite_sha1': '0392f78529848ec72469e8e431875cb98d3a5fb4', 'targets': {'vm05.local': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAVJJjIoDa2cTjdkjKgs/H+4qhgK0vF9IZI84LLvKo+3+ZBDA+acNPK9sl3SYRY6paO1CwgelHH/nU7wFhY4+34='}, 'tasks': [{'internal.check_packages': None}, {'internal.buildpackages_prep': None}, {'internal.save_config': None}, {'internal.check_lock': None}, {'internal.add_remotes': None}, {'console_log': None}, {'internal.connect': None}, {'internal.push_inventory': None}, {'internal.serialize_remote_roles': None}, {'internal.check_conflict': None}, {'internal.check_ceph_data': None}, {'internal.vm_setup': None}, {'internal.base': None}, {'internal.archive_upload': None}, {'internal.archive': None}, {'internal.coredump': None}, {'internal.sudo': None}, {'internal.syslog': None}, {'internal.timer': None}, {'pcp': None}, {'selinux': None}, {'ansible.cephlab': None}, {'clock': None}, {'install': None}, {'ceph': None}, {'workunit': {'clients': {'client.0': ['rbd/cli_generic.sh']}}}], 'teuthology': {'fragments_dropped': [], 'meta': {}, 'postmerge': []}, 'teuthology_branch': 'clyso-debian-13', 'teuthology_repo': 'https://github.com/clyso/teuthology', 'teuthology_sha1': '1c580df7a9c7c2aadc272da296344fd99f27c444', 'timestamp': '2026-03-20_22:04:26', 'tube': 'vps', 'user': 'kyr', 'verbose': False, 'worker_log': '/home/teuthos/.teuthology/dispatcher/dispatcher.vps.2366871'}
2026-03-24T10:48:08.002 DEBUG:teuthology.orchestra.run.vm05:> install -d -m0755 -- /home/ubuntu/cephtest/ceph.data
2026-03-24T10:48:08.042 DEBUG:teuthology.orchestra.run.vm05:> sudo install -d -m0777 -- /var/run/ceph
2026-03-24T10:48:08.091 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-24T10:48:08.092 DEBUG:teuthology.orchestra.run.vm05:> dd if=/scratch_devs of=/dev/stdout
2026-03-24T10:48:08.138 DEBUG:teuthology.misc:devs=['/dev/vg_nvme/lv_1', '/dev/vg_nvme/lv_2', '/dev/vg_nvme/lv_3', '/dev/vg_nvme/lv_4']
2026-03-24T10:48:08.138 DEBUG:teuthology.orchestra.run.vm05:> stat /dev/vg_nvme/lv_1
2026-03-24T10:48:08.183 INFO:teuthology.orchestra.run.vm05.stdout: File: /dev/vg_nvme/lv_1 -> ../dm-0
2026-03-24T10:48:08.183 INFO:teuthology.orchestra.run.vm05.stdout: Size: 7 Blocks: 0 IO Block: 4096 symbolic link
2026-03-24T10:48:08.183 INFO:teuthology.orchestra.run.vm05.stdout:Device: 5h/5d Inode: 776 Links: 1
2026-03-24T10:48:08.183 INFO:teuthology.orchestra.run.vm05.stdout:Access: (0777/lrwxrwxrwx) Uid: ( 0/ root) Gid: ( 0/ root)
2026-03-24T10:48:08.183 INFO:teuthology.orchestra.run.vm05.stdout:Access: 2026-03-24 10:47:09.079111000 +0000
2026-03-24T10:48:08.183 INFO:teuthology.orchestra.run.vm05.stdout:Modify: 2026-03-24 10:47:08.935111000 +0000
2026-03-24T10:48:08.183 INFO:teuthology.orchestra.run.vm05.stdout:Change: 2026-03-24 10:47:08.935111000 +0000
2026-03-24T10:48:08.183 INFO:teuthology.orchestra.run.vm05.stdout: Birth: -
2026-03-24T10:48:08.183 DEBUG:teuthology.orchestra.run.vm05:> sudo dd if=/dev/vg_nvme/lv_1 of=/dev/null count=1
2026-03-24T10:48:08.231 INFO:teuthology.orchestra.run.vm05.stderr:1+0 records in
2026-03-24T10:48:08.231 INFO:teuthology.orchestra.run.vm05.stderr:1+0 records out
2026-03-24T10:48:08.231 INFO:teuthology.orchestra.run.vm05.stderr:512 bytes copied, 0.000176241 s, 2.9 MB/s
2026-03-24T10:48:08.231 DEBUG:teuthology.orchestra.run.vm05:> ! mount | grep -v devtmpfs | grep -q /dev/vg_nvme/lv_1
2026-03-24T10:48:08.280 DEBUG:teuthology.orchestra.run.vm05:> stat /dev/vg_nvme/lv_2
2026-03-24T10:48:08.326 INFO:teuthology.orchestra.run.vm05.stdout: File: /dev/vg_nvme/lv_2 -> ../dm-1
2026-03-24T10:48:08.326 INFO:teuthology.orchestra.run.vm05.stdout: Size: 7 Blocks: 0 IO Block: 4096 symbolic link
2026-03-24T10:48:08.326 INFO:teuthology.orchestra.run.vm05.stdout:Device: 5h/5d Inode: 807 Links: 1
2026-03-24T10:48:08.326 INFO:teuthology.orchestra.run.vm05.stdout:Access: (0777/lrwxrwxrwx) Uid: ( 0/ root) Gid: ( 0/ root)
2026-03-24T10:48:08.326 INFO:teuthology.orchestra.run.vm05.stdout:Access: 2026-03-24 10:47:09.359111000 +0000
2026-03-24T10:48:08.326 INFO:teuthology.orchestra.run.vm05.stdout:Modify: 2026-03-24 10:47:09.231111000 +0000
2026-03-24T10:48:08.326 INFO:teuthology.orchestra.run.vm05.stdout:Change: 2026-03-24 10:47:09.231111000 +0000
2026-03-24T10:48:08.326 INFO:teuthology.orchestra.run.vm05.stdout: Birth: -
2026-03-24T10:48:08.327 DEBUG:teuthology.orchestra.run.vm05:> sudo dd if=/dev/vg_nvme/lv_2 of=/dev/null count=1
2026-03-24T10:48:08.375 INFO:teuthology.orchestra.run.vm05.stderr:1+0 records in
2026-03-24T10:48:08.375 INFO:teuthology.orchestra.run.vm05.stderr:1+0 records out
2026-03-24T10:48:08.375 INFO:teuthology.orchestra.run.vm05.stderr:512 bytes copied, 0.000203841 s, 2.5 MB/s
2026-03-24T10:48:08.375 DEBUG:teuthology.orchestra.run.vm05:> ! mount | grep -v devtmpfs | grep -q /dev/vg_nvme/lv_2
2026-03-24T10:48:08.424 DEBUG:teuthology.orchestra.run.vm05:> stat /dev/vg_nvme/lv_3
2026-03-24T10:48:08.471 INFO:teuthology.orchestra.run.vm05.stdout: File: /dev/vg_nvme/lv_3 -> ../dm-2
2026-03-24T10:48:08.471 INFO:teuthology.orchestra.run.vm05.stdout: Size: 7 Blocks: 0 IO Block: 4096 symbolic link
2026-03-24T10:48:08.471 INFO:teuthology.orchestra.run.vm05.stdout:Device: 5h/5d Inode: 841 Links: 1
2026-03-24T10:48:08.471 INFO:teuthology.orchestra.run.vm05.stdout:Access: (0777/lrwxrwxrwx) Uid: ( 0/ root) Gid: ( 0/ root)
2026-03-24T10:48:08.471 INFO:teuthology.orchestra.run.vm05.stdout:Access: 2026-03-24 10:47:09.651111000 +0000
2026-03-24T10:48:08.471 INFO:teuthology.orchestra.run.vm05.stdout:Modify: 2026-03-24 10:47:09.511111000 +0000
2026-03-24T10:48:08.471 INFO:teuthology.orchestra.run.vm05.stdout:Change: 2026-03-24 10:47:09.511111000 +0000
2026-03-24T10:48:08.471 INFO:teuthology.orchestra.run.vm05.stdout: Birth: -
2026-03-24T10:48:08.471 DEBUG:teuthology.orchestra.run.vm05:> sudo dd if=/dev/vg_nvme/lv_3 of=/dev/null count=1
2026-03-24T10:48:08.519 INFO:teuthology.orchestra.run.vm05.stderr:1+0 records in
2026-03-24T10:48:08.519 INFO:teuthology.orchestra.run.vm05.stderr:1+0 records out
2026-03-24T10:48:08.519 INFO:teuthology.orchestra.run.vm05.stderr:512 bytes copied, 0.000181669 s, 2.8 MB/s
2026-03-24T10:48:08.520 DEBUG:teuthology.orchestra.run.vm05:> ! mount | grep -v devtmpfs | grep -q /dev/vg_nvme/lv_3
2026-03-24T10:48:08.567 DEBUG:teuthology.orchestra.run.vm05:> stat /dev/vg_nvme/lv_4
2026-03-24T10:48:08.615 INFO:teuthology.orchestra.run.vm05.stdout: File: /dev/vg_nvme/lv_4 -> ../dm-3
2026-03-24T10:48:08.615 INFO:teuthology.orchestra.run.vm05.stdout: Size: 7 Blocks: 0 IO Block: 4096 symbolic link
2026-03-24T10:48:08.615 INFO:teuthology.orchestra.run.vm05.stdout:Device: 5h/5d Inode: 872 Links: 1
2026-03-24T10:48:08.615 INFO:teuthology.orchestra.run.vm05.stdout:Access: (0777/lrwxrwxrwx) Uid: ( 0/ root) Gid: ( 0/ root)
2026-03-24T10:48:08.615 INFO:teuthology.orchestra.run.vm05.stdout:Access: 2026-03-24 10:47:14.027111000 +0000
2026-03-24T10:48:08.615 INFO:teuthology.orchestra.run.vm05.stdout:Modify: 2026-03-24 10:47:09.819111000 +0000
2026-03-24T10:48:08.615 INFO:teuthology.orchestra.run.vm05.stdout:Change: 2026-03-24 10:47:09.819111000 +0000
2026-03-24T10:48:08.615 INFO:teuthology.orchestra.run.vm05.stdout: Birth: -
2026-03-24T10:48:08.615 DEBUG:teuthology.orchestra.run.vm05:> sudo dd if=/dev/vg_nvme/lv_4 of=/dev/null count=1
2026-03-24T10:48:08.662 INFO:teuthology.orchestra.run.vm05.stderr:1+0 records in
2026-03-24T10:48:08.663 INFO:teuthology.orchestra.run.vm05.stderr:1+0 records out
2026-03-24T10:48:08.663 INFO:teuthology.orchestra.run.vm05.stderr:512 bytes copied, 0.000174957 s, 2.9 MB/s
2026-03-24T10:48:08.663 DEBUG:teuthology.orchestra.run.vm05:> ! mount | grep -v devtmpfs | grep -q /dev/vg_nvme/lv_4
2026-03-24T10:48:08.711 INFO:tasks.ceph:osd dev map: {'osd.0': '/dev/vg_nvme/lv_1', 'osd.1': '/dev/vg_nvme/lv_2', 'osd.2': '/dev/vg_nvme/lv_3'}
2026-03-24T10:48:08.711 INFO:tasks.ceph:remote_to_roles_to_devs: {Remote(name='ubuntu@vm05.local'): {'osd.0': '/dev/vg_nvme/lv_1', 'osd.1': '/dev/vg_nvme/lv_2', 'osd.2': '/dev/vg_nvme/lv_3'}}
2026-03-24T10:48:08.711 INFO:tasks.ceph:Generating config...
2026-03-24T10:48:08.712 INFO:tasks.ceph:[client] rbd default format = 1
2026-03-24T10:48:08.712 INFO:tasks.ceph:[global] mon client directed command retry = 5
2026-03-24T10:48:08.712 INFO:tasks.ceph:[global] mon warn on pool no app = False
2026-03-24T10:48:08.712 INFO:tasks.ceph:[global] ms inject socket failures = 5000
2026-03-24T10:48:08.712 INFO:tasks.ceph:[mgr] debug mgr = 20
2026-03-24T10:48:08.712 INFO:tasks.ceph:[mgr] debug ms = 1
2026-03-24T10:48:08.712 INFO:tasks.ceph:[mon] debug mon = 20
2026-03-24T10:48:08.712 INFO:tasks.ceph:[mon] debug ms = 1
2026-03-24T10:48:08.712 INFO:tasks.ceph:[mon] debug paxos = 20
2026-03-24T10:48:08.712 INFO:tasks.ceph:[osd] bluestore block size = 96636764160
2026-03-24T10:48:08.712 INFO:tasks.ceph:[osd] bluestore compression algorithm = lz4
2026-03-24T10:48:08.712 INFO:tasks.ceph:[osd] bluestore compression mode = aggressive
2026-03-24T10:48:08.712 INFO:tasks.ceph:[osd] bluestore fsck on mount = True
2026-03-24T10:48:08.712 INFO:tasks.ceph:[osd] debug bluefs = 1/20
2026-03-24T10:48:08.712 INFO:tasks.ceph:[osd] debug bluestore = 1/20
2026-03-24T10:48:08.712 INFO:tasks.ceph:[osd] debug ms = 1
2026-03-24T10:48:08.712 INFO:tasks.ceph:[osd] debug osd = 20
2026-03-24T10:48:08.712 INFO:tasks.ceph:[osd] debug rocksdb = 4/10
2026-03-24T10:48:08.712 INFO:tasks.ceph:[osd] mon osd backfillfull_ratio = 0.85
2026-03-24T10:48:08.712 INFO:tasks.ceph:[osd] mon osd full ratio = 0.9
2026-03-24T10:48:08.712 INFO:tasks.ceph:[osd] mon osd nearfull ratio = 0.8
2026-03-24T10:48:08.712 INFO:tasks.ceph:[osd] osd failsafe full ratio = 0.95
2026-03-24T10:48:08.712 INFO:tasks.ceph:[osd] osd mclock iops capacity threshold hdd = 49000
2026-03-24T10:48:08.712 INFO:tasks.ceph:[osd] osd objectstore = bluestore
2026-03-24T10:48:08.712 INFO:tasks.ceph:[osd] osd shutdown pgref assert = True
2026-03-24T10:48:08.712 INFO:tasks.ceph:Setting up mon.a...
2026-03-24T10:48:08.712 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool --create-keyring /etc/ceph/ceph.keyring
2026-03-24T10:48:08.773 INFO:teuthology.orchestra.run.vm05.stdout:creating /etc/ceph/ceph.keyring
2026-03-24T10:48:08.776 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool --gen-key --name=mon. /etc/ceph/ceph.keyring
2026-03-24T10:48:08.840 DEBUG:teuthology.orchestra.run.vm05:> sudo chmod 0644 /etc/ceph/ceph.keyring
2026-03-24T10:48:08.891 DEBUG:tasks.ceph:Ceph mon addresses: [('mon.a', '192.168.123.105')]
2026-03-24T10:48:08.891 DEBUG:tasks.ceph:writing out conf {'global': {'chdir': '', 'pid file': '/var/run/ceph/$cluster-$name.pid', 'auth supported': 'cephx', 'filestore xattr use omap': 'true', 'mon clock drift allowed': '1.000', 'osd crush chooseleaf type': '0', 'auth debug': 'true', 'ms die on old message': 'true', 'ms die on bug': 'true', 'mon max pg per osd': '10000', 'mon pg warn max object skew': '0', 'osd_pool_default_pg_autoscale_mode': 'off', 'osd pool default size': '2', 'mon osd allow primary affinity': 'true', 'mon osd allow pg remap': 'true', 'mon warn on legacy crush tunables': 'false', 'mon warn on crush straw calc version zero': 'false', 'mon warn on no sortbitwise': 'false', 'mon warn on osd down out interval zero': 'false', 'mon warn on too few osds': 'false', 'mon_warn_on_pool_pg_num_not_power_of_two': 'false', 'mon_warn_on_pool_no_redundancy': 'false', 'mon_allow_pool_size_one': 'true', 'osd pool default erasure code profile': 'plugin=isa technique=reed_sol_van k=2 m=1 crush-failure-domain=osd', 'osd default data pool replay window': '5', 'mon allow pool delete': 'true', 'mon cluster log file level': 'debug', 'debug asserts on shutdown': 'true', 'mon health detail to clog': 'false', 'mon host': '192.168.123.105', 'mon client directed command retry': 5, 'mon warn on pool no app': False, 'ms inject socket failures': 5000}, 'osd': {'osd journal size': '100', 'osd scrub load threshold': '5.0', 'osd scrub max interval': '600', 'osd mclock profile': 'high_recovery_ops', 'osd mclock skip benchmark': 'true', 'osd recover clone overlap': 'true', 'osd recovery max chunk': '1048576', 'osd debug shutdown': 'true', 'osd debug op order': 'true', 'osd debug verify stray on activate': 'true', 'osd debug trim objects': 'true', 'osd open classes on start': 'true', 'osd debug pg log writeout': 'true', 'osd deep scrub update digest min age': '30', 'osd map max advance': '10', 'journal zero on create': 'true', 'filestore ondisk finisher threads': '3', 'filestore apply finisher threads': '3', 'bdev debug aio': 'true', 'osd debug misdirected ops': 'true', 'bluestore block size': 96636764160, 'bluestore compression algorithm': 'lz4', 'bluestore compression mode': 'aggressive', 'bluestore fsck on mount': True, 'debug bluefs': '1/20', 'debug bluestore': '1/20', 'debug ms': 1, 'debug osd': 20, 'debug rocksdb': '4/10', 'mon osd backfillfull_ratio': 0.85, 'mon osd full ratio': 0.9, 'mon osd nearfull ratio': 0.8, 'osd failsafe full ratio': 0.95, 'osd mclock iops capacity threshold hdd': 49000, 'osd objectstore': 'bluestore', 'osd shutdown pgref assert': True}, 'mgr': {'debug ms': 1, 'debug mgr': 20, 'debug mon': '20', 'debug auth': '20', 'mon reweight min pgs per osd': '4', 'mon reweight min bytes per osd': '10', 'mgr/telemetry/nag': 'false'}, 'mon': {'debug ms': 1, 'debug mon': 20, 'debug paxos': 20, 'debug auth': '20', 'mon data avail warn': '5', 'mon mgr mkfs grace': '240', 'mon reweight min pgs per osd': '4', 'mon osd reporter subtree level': 'osd', 'mon osd prime pg temp': 'true', 'mon reweight min bytes per osd': '10', 'auth mon ticket ttl': '660', 'auth service ticket ttl': '240', 'mon_warn_on_insecure_global_id_reclaim': 'false', 'mon_warn_on_insecure_global_id_reclaim_allowed': 'false', 'mon_down_mkfs_grace': '2m', 'mon_warn_on_filestore_osds': 'false'}, 'client': {'rgw cache enabled': 'true', 'rgw enable ops log': 'true', 'rgw enable usage log': 'true', 'log file': '/var/log/ceph/$cluster-$name.$pid.log', 'admin socket': '/var/run/ceph/$cluster-$name.$pid.asok', 'rbd default format': 1}, 'mon.a': {}}
2026-03-24T10:48:08.891 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-24T10:48:08.891 DEBUG:teuthology.orchestra.run.vm05:> dd of=/home/ubuntu/cephtest/ceph.tmp.conf
2026-03-24T10:48:08.934 DEBUG:teuthology.orchestra.run.vm05:> adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage monmaptool -c /home/ubuntu/cephtest/ceph.tmp.conf --create --clobber --enable-all-features --add a 192.168.123.105 --print /home/ubuntu/cephtest/ceph.monmap
2026-03-24T10:48:08.993 INFO:teuthology.orchestra.run.vm05.stdout:monmaptool: monmap file /home/ubuntu/cephtest/ceph.monmap
2026-03-24T10:48:08.993 INFO:teuthology.orchestra.run.vm05.stdout:monmaptool: generated fsid a0c8ea99-5654-4097-bf98-f2a2e799bc82
2026-03-24T10:48:08.993 INFO:teuthology.orchestra.run.vm05.stdout:setting min_mon_release = tentacle
2026-03-24T10:48:08.993 INFO:teuthology.orchestra.run.vm05.stdout:epoch 0
2026-03-24T10:48:08.993 INFO:teuthology.orchestra.run.vm05.stdout:fsid a0c8ea99-5654-4097-bf98-f2a2e799bc82
2026-03-24T10:48:08.993 INFO:teuthology.orchestra.run.vm05.stdout:last_changed 2026-03-24T10:48:08.989524+0000
2026-03-24T10:48:08.993 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-24T10:48:08.989524+0000
2026-03-24T10:48:08.993 INFO:teuthology.orchestra.run.vm05.stdout:min_mon_release 20 (tentacle)
2026-03-24T10:48:08.993 INFO:teuthology.orchestra.run.vm05.stdout:election_strategy: 1
2026-03-24T10:48:08.993 INFO:teuthology.orchestra.run.vm05.stdout:0: [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] mon.a
2026-03-24T10:48:08.993 INFO:teuthology.orchestra.run.vm05.stdout:monmaptool: writing epoch 0 to /home/ubuntu/cephtest/ceph.monmap (1 monitors)
2026-03-24T10:48:08.994 DEBUG:teuthology.orchestra.run.vm05:> rm -- /home/ubuntu/cephtest/ceph.tmp.conf
2026-03-24T10:48:09.038 INFO:tasks.ceph:Writing /etc/ceph/ceph.conf for FSID a0c8ea99-5654-4097-bf98-f2a2e799bc82...
2026-03-24T10:48:09.038 DEBUG:teuthology.orchestra.run.vm05:> sudo mkdir -p /etc/ceph && sudo chmod 0755 /etc/ceph && sudo tee /etc/ceph/ceph.conf && sudo chmod 0644 /etc/ceph/ceph.conf > /dev/null
2026-03-24T10:48:09.097 INFO:teuthology.orchestra.run.vm05.stdout:[global]
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: chdir = ""
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: pid file = /var/run/ceph/$cluster-$name.pid
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: auth supported = cephx
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: filestore xattr use omap = true
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: mon clock drift allowed = 1.000
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: osd crush chooseleaf type = 0
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: auth debug = true
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: ms die on old message = true
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: ms die on bug = true
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: mon max pg per osd = 10000 # >= luminous
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: mon pg warn max object skew = 0
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: # disable pg_autoscaler by default for new pools
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: osd_pool_default_pg_autoscale_mode = off
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: osd pool default size = 2
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: mon osd allow primary affinity = true
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: mon osd allow pg remap = true
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: mon warn on legacy crush tunables = false
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: mon warn on crush straw calc version zero = false
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: mon warn on no sortbitwise = false
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: mon warn on osd down out interval zero = false
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: mon warn on too few osds = false
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: mon_warn_on_pool_pg_num_not_power_of_two = false
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: mon_warn_on_pool_no_redundancy = false
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: mon_allow_pool_size_one = true
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: osd pool default erasure code profile = plugin=isa technique=reed_sol_van k=2 m=1 crush-failure-domain=osd
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: osd default data pool replay window = 5
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: mon allow pool delete = true
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: mon cluster log file level = debug
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: debug asserts on shutdown = true
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: mon health detail to clog = false
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: mon host = 192.168.123.105
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: mon client directed command retry = 5
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: mon warn on pool no app = False
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: ms inject socket failures = 5000
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: fsid = a0c8ea99-5654-4097-bf98-f2a2e799bc82
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout:[osd]
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: osd journal size = 100
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: osd scrub load threshold = 5.0
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: osd scrub max interval = 600
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: osd mclock profile = high_recovery_ops
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: osd mclock skip benchmark = true
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: osd recover clone overlap = true
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: osd recovery max chunk = 1048576
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:09.098 INFO:teuthology.orchestra.run.vm05.stdout: osd debug shutdown = true
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: osd debug op order = true
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: osd debug verify stray on activate = true
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: osd debug trim objects = true
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: osd open classes on start = true
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: osd debug pg log writeout = true
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: osd deep scrub update digest min age = 30
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: osd map max advance = 10
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: journal zero on create = true
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: filestore ondisk finisher threads = 3
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: filestore apply finisher threads = 3
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: bdev debug aio = true
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: osd debug misdirected ops = true
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: bluestore block size = 96636764160
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: bluestore compression algorithm = lz4
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: bluestore compression mode = aggressive
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: bluestore fsck on mount = True
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: debug bluefs = 1/20
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: debug bluestore = 1/20
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: debug ms = 1
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: debug osd = 20
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: debug rocksdb = 4/10
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: mon osd backfillfull_ratio = 0.85
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: mon osd full ratio = 0.9
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: mon osd nearfull ratio = 0.8
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: osd failsafe full ratio = 0.95
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: osd mclock iops capacity threshold hdd = 49000
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: osd objectstore = bluestore
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: osd shutdown pgref assert = True
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout:[mgr]
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: debug ms = 1
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: debug mgr = 20
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: debug mon = 20
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: debug auth = 20
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: mon reweight min pgs per osd = 4
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: mon reweight min bytes per osd = 10
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: mgr/telemetry/nag = false
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout:[mon]
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: debug ms = 1
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: debug mon = 20
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: debug paxos = 20
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: debug auth = 20
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: mon data avail warn = 5
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: mon mgr mkfs grace = 240
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: mon reweight min pgs per osd = 4
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: mon osd reporter subtree level = osd
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: mon osd prime pg temp = true
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: mon reweight min bytes per osd = 10
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: # rotate auth tickets quickly to exercise renewal paths
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: auth mon ticket ttl = 660 # 11m
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: auth service ticket ttl = 240 # 4m
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: # don't complain about insecure global_id in the test suite
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: mon_warn_on_insecure_global_id_reclaim = false
2026-03-24T10:48:09.099 INFO:teuthology.orchestra.run.vm05.stdout: mon_warn_on_insecure_global_id_reclaim_allowed = false
2026-03-24T10:48:09.100 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:09.100 INFO:teuthology.orchestra.run.vm05.stdout: # 1m isn't quite enough
2026-03-24T10:48:09.100 INFO:teuthology.orchestra.run.vm05.stdout: mon_down_mkfs_grace = 2m
2026-03-24T10:48:09.100 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:09.100 INFO:teuthology.orchestra.run.vm05.stdout: mon_warn_on_filestore_osds = false
2026-03-24T10:48:09.100 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:09.100 INFO:teuthology.orchestra.run.vm05.stdout:[client]
2026-03-24T10:48:09.100 INFO:teuthology.orchestra.run.vm05.stdout: rgw cache enabled = true
2026-03-24T10:48:09.100 INFO:teuthology.orchestra.run.vm05.stdout: rgw enable ops log = true
2026-03-24T10:48:09.100 INFO:teuthology.orchestra.run.vm05.stdout: rgw enable usage log = true
2026-03-24T10:48:09.100 INFO:teuthology.orchestra.run.vm05.stdout: log file = /var/log/ceph/$cluster-$name.$pid.log
2026-03-24T10:48:09.100 INFO:teuthology.orchestra.run.vm05.stdout: admin socket = /var/run/ceph/$cluster-$name.$pid.asok
2026-03-24T10:48:09.100 INFO:teuthology.orchestra.run.vm05.stdout: rbd default format = 1
2026-03-24T10:48:09.100 INFO:teuthology.orchestra.run.vm05.stdout:[mon.a]
2026-03-24T10:48:09.104 INFO:tasks.ceph:Creating admin key on mon.a...
2026-03-24T10:48:09.104 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool --gen-key --name=client.admin --cap mon 'allow *' --cap osd 'allow *' --cap mds 'allow *' --cap mgr 'allow *' /etc/ceph/ceph.keyring
2026-03-24T10:48:09.172 INFO:tasks.ceph:Copying monmap to all nodes...
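The conf dict logged by tasks.ceph above is rendered into the INI-style ceph.conf that `sudo tee` writes out. A minimal sketch of that dict-to-INI rendering in Python (the `write_conf` helper and the abridged dict are illustrative, not teuthology's actual code; indentation style is assumed):

```python
# Sketch: render a nested {section: {option: value}} dict, as logged by
# tasks.ceph above, into INI-style ceph.conf text. write_conf is a
# hypothetical helper; the dict below is abridged from the logged conf.
def write_conf(conf: dict) -> str:
    lines = []
    for section, options in conf.items():
        lines.append(f"[{section}]")
        for key, value in options.items():
            lines.append(f"\t{key} = {value}")
        lines.append("")  # blank line between sections
    return "\n".join(lines)

conf = {
    "global": {
        "auth supported": "cephx",
        "mon host": "192.168.123.105",
        "fsid": "a0c8ea99-5654-4097-bf98-f2a2e799bc82",
    },
    "client": {"rbd default format": 1},
    "mon.a": {},  # empty section still gets a header, as in the log
}
print(write_conf(conf))
```

Note how `'mon.a': {}` still produces a bare `[mon.a]` header, matching the tail of the rendered conf in the log.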
2026-03-24T10:48:09.172 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-24T10:48:09.172 DEBUG:teuthology.orchestra.run.vm05:> dd if=/etc/ceph/ceph.keyring of=/dev/stdout
2026-03-24T10:48:09.218 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-24T10:48:09.218 DEBUG:teuthology.orchestra.run.vm05:> dd if=/home/ubuntu/cephtest/ceph.monmap of=/dev/stdout
2026-03-24T10:48:09.262 INFO:tasks.ceph:Sending monmap to node ubuntu@vm05.local
2026-03-24T10:48:09.262 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-24T10:48:09.262 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/etc/ceph/ceph.keyring
2026-03-24T10:48:09.262 DEBUG:teuthology.orchestra.run.vm05:> sudo chmod 0644 /etc/ceph/ceph.keyring
2026-03-24T10:48:09.317 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-24T10:48:09.317 DEBUG:teuthology.orchestra.run.vm05:> dd of=/home/ubuntu/cephtest/ceph.monmap
2026-03-24T10:48:09.362 INFO:tasks.ceph:Setting up mon nodes...
2026-03-24T10:48:09.362 INFO:tasks.ceph:Setting up mgr nodes...
2026-03-24T10:48:09.362 DEBUG:teuthology.orchestra.run.vm05:> sudo mkdir -p /var/lib/ceph/mgr/ceph-x && sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool --create-keyring --gen-key --name=mgr.x /var/lib/ceph/mgr/ceph-x/keyring
2026-03-24T10:48:09.426 INFO:teuthology.orchestra.run.vm05.stdout:creating /var/lib/ceph/mgr/ceph-x/keyring
2026-03-24T10:48:09.428 INFO:tasks.ceph:Setting up mds nodes...
2026-03-24T10:48:09.428 INFO:tasks.ceph_client:Setting up client nodes...
2026-03-24T10:48:09.428 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool --create-keyring --gen-key --name=client.0 /etc/ceph/ceph.client.0.keyring && sudo chmod 0644 /etc/ceph/ceph.client.0.keyring
2026-03-24T10:48:09.490 INFO:teuthology.orchestra.run.vm05.stdout:creating /etc/ceph/ceph.client.0.keyring
2026-03-24T10:48:09.498 INFO:tasks.ceph:Running mkfs on osd nodes...
2026-03-24T10:48:09.498 INFO:tasks.ceph:ctx.disk_config.remote_to_roles_to_dev: {Remote(name='ubuntu@vm05.local'): {'osd.0': '/dev/vg_nvme/lv_1', 'osd.1': '/dev/vg_nvme/lv_2', 'osd.2': '/dev/vg_nvme/lv_3'}}
2026-03-24T10:48:09.498 DEBUG:teuthology.orchestra.run.vm05:> sudo mkdir -p /var/lib/ceph/osd/ceph-0
2026-03-24T10:48:09.548 INFO:tasks.ceph:roles_to_devs: {'osd.0': '/dev/vg_nvme/lv_1', 'osd.1': '/dev/vg_nvme/lv_2', 'osd.2': '/dev/vg_nvme/lv_3'}
2026-03-24T10:48:09.548 INFO:tasks.ceph:role: osd.0
2026-03-24T10:48:09.548 INFO:tasks.ceph:['mkfs.xfs', '-f', '-i', 'size=2048'] on /dev/vg_nvme/lv_1 on ubuntu@vm05.local
2026-03-24T10:48:09.548 DEBUG:teuthology.orchestra.run.vm05:> yes | sudo mkfs.xfs -f -i size=2048 /dev/vg_nvme/lv_1
2026-03-24T10:48:09.601 INFO:teuthology.orchestra.run.vm05.stdout:meta-data=/dev/vg_nvme/lv_1 isize=2048 agcount=4, agsize=1310464 blks
2026-03-24T10:48:09.601 INFO:teuthology.orchestra.run.vm05.stdout: = sectsz=512 attr=2, projid32bit=1
2026-03-24T10:48:09.601 INFO:teuthology.orchestra.run.vm05.stdout: = crc=1 finobt=1, sparse=1, rmapbt=0
2026-03-24T10:48:09.601 INFO:teuthology.orchestra.run.vm05.stdout: = reflink=1 bigtime=0 inobtcount=0
2026-03-24T10:48:09.601 INFO:teuthology.orchestra.run.vm05.stdout:data = bsize=4096 blocks=5241856, imaxpct=25
2026-03-24T10:48:09.601 INFO:teuthology.orchestra.run.vm05.stdout: = sunit=0 swidth=0 blks
2026-03-24T10:48:09.601 INFO:teuthology.orchestra.run.vm05.stdout:naming =version 2 bsize=4096 ascii-ci=0, ftype=1
2026-03-24T10:48:09.601 INFO:teuthology.orchestra.run.vm05.stdout:log =internal log bsize=4096 blocks=2560, version=2
2026-03-24T10:48:09.601 INFO:teuthology.orchestra.run.vm05.stdout: = sectsz=512 sunit=0 blks, lazy-count=1
2026-03-24T10:48:09.601 INFO:teuthology.orchestra.run.vm05.stdout:realtime =none extsz=4096 blocks=0, rtextents=0
2026-03-24T10:48:09.606 INFO:teuthology.orchestra.run.vm05.stdout:Discarding blocks...Done.
2026-03-24T10:48:09.607 INFO:tasks.ceph:mount /dev/vg_nvme/lv_1 on ubuntu@vm05.local -o noatime
2026-03-24T10:48:09.607 DEBUG:teuthology.orchestra.run.vm05:> sudo mount -t xfs -o noatime /dev/vg_nvme/lv_1 /var/lib/ceph/osd/ceph-0
2026-03-24T10:48:09.704 DEBUG:teuthology.orchestra.run.vm05:> sudo /sbin/restorecon /var/lib/ceph/osd/ceph-0
2026-03-24T10:48:09.710 INFO:teuthology.orchestra.run.vm05.stderr:sudo: /sbin/restorecon: command not found
2026-03-24T10:48:09.710 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-24T10:48:09.711 DEBUG:teuthology.orchestra.run.vm05:> sudo mkdir -p /var/lib/ceph/osd/ceph-1
2026-03-24T10:48:09.758 INFO:tasks.ceph:roles_to_devs: {'osd.0': '/dev/vg_nvme/lv_1', 'osd.1': '/dev/vg_nvme/lv_2', 'osd.2': '/dev/vg_nvme/lv_3'}
2026-03-24T10:48:09.758 INFO:tasks.ceph:role: osd.1
2026-03-24T10:48:09.758 INFO:tasks.ceph:['mkfs.xfs', '-f', '-i', 'size=2048'] on /dev/vg_nvme/lv_2 on ubuntu@vm05.local
2026-03-24T10:48:09.758 DEBUG:teuthology.orchestra.run.vm05:> yes | sudo mkfs.xfs -f -i size=2048 /dev/vg_nvme/lv_2
2026-03-24T10:48:09.806 INFO:teuthology.orchestra.run.vm05.stdout:meta-data=/dev/vg_nvme/lv_2 isize=2048 agcount=4, agsize=1310464 blks
2026-03-24T10:48:09.806 INFO:teuthology.orchestra.run.vm05.stdout: = sectsz=512 attr=2, projid32bit=1
2026-03-24T10:48:09.806 INFO:teuthology.orchestra.run.vm05.stdout: = crc=1 finobt=1, sparse=1, rmapbt=0
2026-03-24T10:48:09.806 INFO:teuthology.orchestra.run.vm05.stdout: = reflink=1 bigtime=0 inobtcount=0
2026-03-24T10:48:09.806 INFO:teuthology.orchestra.run.vm05.stdout:data = bsize=4096 blocks=5241856, imaxpct=25
2026-03-24T10:48:09.806 INFO:teuthology.orchestra.run.vm05.stdout: = sunit=0 swidth=0 blks
2026-03-24T10:48:09.806 INFO:teuthology.orchestra.run.vm05.stdout:naming =version 2 bsize=4096 ascii-ci=0, ftype=1
2026-03-24T10:48:09.806 INFO:teuthology.orchestra.run.vm05.stdout:log =internal log bsize=4096 blocks=2560, version=2
2026-03-24T10:48:09.807 INFO:teuthology.orchestra.run.vm05.stdout: = sectsz=512 sunit=0 blks, lazy-count=1
2026-03-24T10:48:09.807 INFO:teuthology.orchestra.run.vm05.stdout:realtime =none extsz=4096 blocks=0, rtextents=0
2026-03-24T10:48:09.811 INFO:teuthology.orchestra.run.vm05.stdout:Discarding blocks...Done.
2026-03-24T10:48:09.812 INFO:tasks.ceph:mount /dev/vg_nvme/lv_2 on ubuntu@vm05.local -o noatime
2026-03-24T10:48:09.812 DEBUG:teuthology.orchestra.run.vm05:> sudo mount -t xfs -o noatime /dev/vg_nvme/lv_2 /var/lib/ceph/osd/ceph-1
2026-03-24T10:48:09.872 DEBUG:teuthology.orchestra.run.vm05:> sudo /sbin/restorecon /var/lib/ceph/osd/ceph-1
2026-03-24T10:48:09.921 INFO:teuthology.orchestra.run.vm05.stderr:sudo: /sbin/restorecon: command not found
2026-03-24T10:48:09.921 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-24T10:48:09.921 DEBUG:teuthology.orchestra.run.vm05:> sudo mkdir -p /var/lib/ceph/osd/ceph-2
2026-03-24T10:48:09.969 INFO:tasks.ceph:roles_to_devs: {'osd.0': '/dev/vg_nvme/lv_1', 'osd.1': '/dev/vg_nvme/lv_2', 'osd.2': '/dev/vg_nvme/lv_3'}
2026-03-24T10:48:09.969 INFO:tasks.ceph:role: osd.2
2026-03-24T10:48:09.969 INFO:tasks.ceph:['mkfs.xfs', '-f', '-i', 'size=2048'] on /dev/vg_nvme/lv_3 on ubuntu@vm05.local
2026-03-24T10:48:09.969 DEBUG:teuthology.orchestra.run.vm05:> yes | sudo mkfs.xfs -f -i size=2048 /dev/vg_nvme/lv_3
2026-03-24T10:48:10.017 INFO:teuthology.orchestra.run.vm05.stdout:meta-data=/dev/vg_nvme/lv_3 isize=2048 agcount=4, agsize=1310464 blks
2026-03-24T10:48:10.017 INFO:teuthology.orchestra.run.vm05.stdout: = sectsz=512 attr=2, projid32bit=1
2026-03-24T10:48:10.017 INFO:teuthology.orchestra.run.vm05.stdout: = crc=1 finobt=1, sparse=1, rmapbt=0
2026-03-24T10:48:10.017 INFO:teuthology.orchestra.run.vm05.stdout: = reflink=1 bigtime=0 inobtcount=0
2026-03-24T10:48:10.017 INFO:teuthology.orchestra.run.vm05.stdout:data = bsize=4096 blocks=5241856, imaxpct=25
2026-03-24T10:48:10.017 INFO:teuthology.orchestra.run.vm05.stdout: = sunit=0 swidth=0 blks
2026-03-24T10:48:10.017 INFO:teuthology.orchestra.run.vm05.stdout:naming =version 2 bsize=4096 ascii-ci=0, ftype=1
2026-03-24T10:48:10.017 INFO:teuthology.orchestra.run.vm05.stdout:log =internal log bsize=4096 blocks=2560, version=2
2026-03-24T10:48:10.017 INFO:teuthology.orchestra.run.vm05.stdout: = sectsz=512 sunit=0 blks, lazy-count=1
2026-03-24T10:48:10.017 INFO:teuthology.orchestra.run.vm05.stdout:realtime =none extsz=4096 blocks=0, rtextents=0
2026-03-24T10:48:10.022 INFO:teuthology.orchestra.run.vm05.stdout:Discarding blocks...Done.
2026-03-24T10:48:10.022 INFO:tasks.ceph:mount /dev/vg_nvme/lv_3 on ubuntu@vm05.local -o noatime
2026-03-24T10:48:10.023 DEBUG:teuthology.orchestra.run.vm05:> sudo mount -t xfs -o noatime /dev/vg_nvme/lv_3 /var/lib/ceph/osd/ceph-2
2026-03-24T10:48:10.077 DEBUG:teuthology.orchestra.run.vm05:> sudo /sbin/restorecon /var/lib/ceph/osd/ceph-2
2026-03-24T10:48:10.124 INFO:teuthology.orchestra.run.vm05.stderr:sudo: /sbin/restorecon: command not found
2026-03-24T10:48:10.124 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-24T10:48:10.124 DEBUG:teuthology.orchestra.run.vm05:> sudo MALLOC_CHECK_=3 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-osd --no-mon-config --cluster ceph --mkfs --mkkey -i 0 --monmap /home/ubuntu/cephtest/ceph.monmap
2026-03-24T10:48:10.187 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-24T10:48:10.179+0000 7f44da47da40 -1 auth: error reading file: /var/lib/ceph/osd/ceph-0/keyring: can't open /var/lib/ceph/osd/ceph-0/keyring: (2) No such file or directory
2026-03-24T10:48:10.187 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-24T10:48:10.179+0000 7f44da47da40 -1 created new key in keyring /var/lib/ceph/osd/ceph-0/keyring
2026-03-24T10:48:10.187 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-24T10:48:10.179+0000 7f44da47da40 -1 bdev(0x557110b8f800 /var/lib/ceph/osd/ceph-0/block) open stat got: (1) Operation not permitted
2026-03-24T10:48:10.187 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-24T10:48:10.183+0000 7f44da47da40 -1 bluestore(/var/lib/ceph/osd/ceph-0) _read_fsid unparsable uuid
2026-03-24T10:48:11.066 DEBUG:teuthology.orchestra.run.vm05:> sudo chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
2026-03-24T10:48:11.114 DEBUG:teuthology.orchestra.run.vm05:> sudo MALLOC_CHECK_=3 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-osd --no-mon-config --cluster ceph --mkfs --mkkey -i 1 --monmap /home/ubuntu/cephtest/ceph.monmap
2026-03-24T10:48:11.175 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-24T10:48:11.171+0000 7fdc93b26a40 -1 auth: error reading file: /var/lib/ceph/osd/ceph-1/keyring: can't open /var/lib/ceph/osd/ceph-1/keyring: (2) No such file or directory
2026-03-24T10:48:11.175 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-24T10:48:11.171+0000 7fdc93b26a40 -1 created new key in keyring /var/lib/ceph/osd/ceph-1/keyring
2026-03-24T10:48:11.175 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-24T10:48:11.171+0000 7fdc93b26a40 -1 bdev(0x55bb984d9800 /var/lib/ceph/osd/ceph-1/block) open stat got: (1) Operation not permitted
2026-03-24T10:48:11.176 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-24T10:48:11.171+0000 7fdc93b26a40 -1 bluestore(/var/lib/ceph/osd/ceph-1) _read_fsid unparsable uuid
2026-03-24T10:48:12.071 DEBUG:teuthology.orchestra.run.vm05:> sudo chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
2026-03-24T10:48:12.119 DEBUG:teuthology.orchestra.run.vm05:> sudo MALLOC_CHECK_=3 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-osd --no-mon-config --cluster ceph --mkfs --mkkey -i 2 --monmap /home/ubuntu/cephtest/ceph.monmap
2026-03-24T10:48:12.185 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-24T10:48:12.179+0000 7f005fcb9a40 -1 auth: error reading file: /var/lib/ceph/osd/ceph-2/keyring: can't open /var/lib/ceph/osd/ceph-2/keyring: (2) No such file or directory
2026-03-24T10:48:12.185 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-24T10:48:12.179+0000 7f005fcb9a40 -1 created new key in keyring /var/lib/ceph/osd/ceph-2/keyring
2026-03-24T10:48:12.185 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-24T10:48:12.179+0000 7f005fcb9a40 -1 bdev(0x555c217bb800 /var/lib/ceph/osd/ceph-2/block) open stat got: (1) Operation not permitted
2026-03-24T10:48:12.185 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-24T10:48:12.179+0000 7f005fcb9a40 -1 bluestore(/var/lib/ceph/osd/ceph-2) _read_fsid unparsable uuid
2026-03-24T10:48:13.046 DEBUG:teuthology.orchestra.run.vm05:> sudo chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
2026-03-24T10:48:13.095 INFO:tasks.ceph:Reading keys from all nodes...
2026-03-24T10:48:13.095 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-24T10:48:13.095 DEBUG:teuthology.orchestra.run.vm05:> sudo dd if=/var/lib/ceph/mgr/ceph-x/keyring of=/dev/stdout
2026-03-24T10:48:13.143 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-24T10:48:13.143 DEBUG:teuthology.orchestra.run.vm05:> sudo dd if=/var/lib/ceph/osd/ceph-0/keyring of=/dev/stdout
2026-03-24T10:48:13.191 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-24T10:48:13.191 DEBUG:teuthology.orchestra.run.vm05:> sudo dd if=/var/lib/ceph/osd/ceph-1/keyring of=/dev/stdout
2026-03-24T10:48:13.243 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-24T10:48:13.243 DEBUG:teuthology.orchestra.run.vm05:> sudo dd if=/var/lib/ceph/osd/ceph-2/keyring of=/dev/stdout
2026-03-24T10:48:13.295 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-24T10:48:13.295 DEBUG:teuthology.orchestra.run.vm05:> dd if=/etc/ceph/ceph.client.0.keyring of=/dev/stdout
2026-03-24T10:48:13.342 INFO:tasks.ceph:Adding keys to all mons...
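The OSD preparation above repeats one pattern per role: make the data dir, mkfs the logical volume, mount it, run `ceph-osd --mkfs --mkkey`, then chown. A sketch of that loop in Python, mirroring the `roles_to_devs` mapping from the log (commands are only printed here; `prep_commands` is an illustrative helper, not teuthology code, and sudo/error handling are omitted):

```python
# Sketch of the per-OSD preparation loop seen in the log: each osd.N role
# gets a data dir, an XFS filesystem on its LV, a mount, a ceph-osd --mkfs
# run, and a chown. Mapping copied from the logged roles_to_devs.
roles_to_devs = {
    "osd.0": "/dev/vg_nvme/lv_1",
    "osd.1": "/dev/vg_nvme/lv_2",
    "osd.2": "/dev/vg_nvme/lv_3",
}

def prep_commands(role: str, dev: str) -> list[str]:
    osd_id = role.split(".")[1]
    data_dir = f"/var/lib/ceph/osd/ceph-{osd_id}"
    return [
        f"mkdir -p {data_dir}",
        f"mkfs.xfs -f -i size=2048 {dev}",
        f"mount -t xfs -o noatime {dev} {data_dir}",
        f"ceph-osd --no-mon-config --cluster ceph --mkfs --mkkey -i {osd_id} "
        f"--monmap /home/ubuntu/cephtest/ceph.monmap",
        f"chown -R ceph:ceph {data_dir}",
    ]

for role, dev in roles_to_devs.items():
    for cmd in prep_commands(role, dev):
        print(cmd)
```

The `/sbin/restorecon` failures in the log are part of the same loop but nonfatal on Ubuntu (no SELinux tooling), so they are left out of the sketch.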
2026-03-24T10:48:13.342 DEBUG:teuthology.orchestra.run.vm05:> sudo tee -a /etc/ceph/ceph.keyring
2026-03-24T10:48:13.390 INFO:teuthology.orchestra.run.vm05.stdout:[mgr.x]
2026-03-24T10:48:13.390 INFO:teuthology.orchestra.run.vm05.stdout: key = AQDpa8JproYyGRAAHZoZK8QJMQttuxoPHGTIFg==
2026-03-24T10:48:13.390 INFO:teuthology.orchestra.run.vm05.stdout:[osd.0]
2026-03-24T10:48:13.390 INFO:teuthology.orchestra.run.vm05.stdout: key = AQDqa8JpjuHvChAA7/l659TrX7f7S1QULyTOoQ==
2026-03-24T10:48:13.390 INFO:teuthology.orchestra.run.vm05.stdout:[osd.1]
2026-03-24T10:48:13.390 INFO:teuthology.orchestra.run.vm05.stdout: key = AQDra8JpnFNAChAAw6MvQD8/0kD3cMtqUgGLag==
2026-03-24T10:48:13.390 INFO:teuthology.orchestra.run.vm05.stdout:[osd.2]
2026-03-24T10:48:13.390 INFO:teuthology.orchestra.run.vm05.stdout: key = AQDsa8Jp6X7QChAAbLK1HclfxNO5OKAZDGjfxA==
2026-03-24T10:48:13.390 INFO:teuthology.orchestra.run.vm05.stdout:[client.0]
2026-03-24T10:48:13.390 INFO:teuthology.orchestra.run.vm05.stdout: key = AQDpa8Jp6FIFHRAA57fjKG+sjCScQwzPWeS6Sg==
2026-03-24T10:48:13.391 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool /etc/ceph/ceph.keyring --name=mgr.x --cap mon 'allow profile mgr' --cap osd 'allow *' --cap mds 'allow *'
2026-03-24T10:48:13.454 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool /etc/ceph/ceph.keyring --name=osd.0 --cap mon 'allow profile osd' --cap mgr 'allow profile osd' --cap osd 'allow *'
2026-03-24T10:48:13.518 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool /etc/ceph/ceph.keyring --name=osd.1 --cap mon 'allow profile osd' --cap mgr 'allow profile osd' --cap osd 'allow *'
2026-03-24T10:48:13.582 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool /etc/ceph/ceph.keyring --name=osd.2 --cap mon 'allow profile osd' --cap mgr 'allow profile osd' --cap osd 'allow *'
2026-03-24T10:48:13.646 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool /etc/ceph/ceph.keyring --name=client.0 --cap mon 'allow rw' --cap mgr 'allow r' --cap osd 'allow rwx' --cap mds allow
2026-03-24T10:48:13.711 INFO:tasks.ceph:Running mkfs on mon nodes...
2026-03-24T10:48:13.711 DEBUG:teuthology.orchestra.run.vm05:> sudo mkdir -p /var/lib/ceph/mon/ceph-a
2026-03-24T10:48:13.763 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-mon --cluster ceph --mkfs -i a --monmap /home/ubuntu/cephtest/ceph.monmap --keyring /etc/ceph/ceph.keyring
2026-03-24T10:48:13.844 DEBUG:teuthology.orchestra.run.vm05:> sudo chown -R ceph:ceph /var/lib/ceph/mon/ceph-a
2026-03-24T10:48:13.895 DEBUG:teuthology.orchestra.run.vm05:> rm -- /home/ubuntu/cephtest/ceph.monmap
2026-03-24T10:48:13.942 INFO:tasks.ceph:Starting mon daemons in cluster ceph...
2026-03-24T10:48:13.942 INFO:tasks.ceph.mon.a:Restarting daemon
2026-03-24T10:48:13.942 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-mon -f --cluster ceph -i a
2026-03-24T10:48:13.984 INFO:tasks.ceph.mon.a:Started
2026-03-24T10:48:13.984 INFO:tasks.ceph:Starting mgr daemons in cluster ceph...
2026-03-24T10:48:13.984 INFO:tasks.ceph.mgr.x:Restarting daemon
2026-03-24T10:48:13.984 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-mgr -f --cluster ceph -i x
2026-03-24T10:48:13.985 INFO:tasks.ceph.mgr.x:Started
2026-03-24T10:48:13.985 DEBUG:tasks.ceph:set 0 configs
2026-03-24T10:48:13.985 DEBUG:teuthology.orchestra.run.vm05:> sudo ceph --cluster ceph config dump
2026-03-24T10:48:14.085 INFO:teuthology.orchestra.run.vm05.stdout:WHO MASK LEVEL OPTION VALUE RO
2026-03-24T10:48:14.098 INFO:tasks.ceph:Setting crush tunables to default
2026-03-24T10:48:14.098 DEBUG:teuthology.orchestra.run.vm05:> sudo ceph --cluster ceph osd crush tunables default
2026-03-24T10:48:14.200 INFO:teuthology.orchestra.run.vm05.stderr:adjusted tunables profile to default
2026-03-24T10:48:14.212 INFO:tasks.ceph:check_enable_crimson: False
2026-03-24T10:48:14.212 INFO:tasks.ceph:Starting osd daemons in cluster ceph...
2026-03-24T10:48:14.212 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-24T10:48:14.212 DEBUG:teuthology.orchestra.run.vm05:> sudo dd if=/var/lib/ceph/osd/ceph-0/fsid of=/dev/stdout
2026-03-24T10:48:14.220 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-24T10:48:14.220 DEBUG:teuthology.orchestra.run.vm05:> sudo dd if=/var/lib/ceph/osd/ceph-1/fsid of=/dev/stdout
2026-03-24T10:48:14.271 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-24T10:48:14.272 DEBUG:teuthology.orchestra.run.vm05:> sudo dd if=/var/lib/ceph/osd/ceph-2/fsid of=/dev/stdout
2026-03-24T10:48:14.327 DEBUG:teuthology.orchestra.run.vm05:> sudo ceph --cluster ceph osd new 3022b391-fa44-4bc1-b17f-aae1f26aed61 0
2026-03-24T10:48:14.474 INFO:teuthology.orchestra.run.vm05.stdout:0
2026-03-24T10:48:14.487 DEBUG:teuthology.orchestra.run.vm05:> sudo ceph --cluster ceph osd new 25e05012-9142-46e5-8250-a717d8c70af9 1
2026-03-24T10:48:14.595 INFO:teuthology.orchestra.run.vm05.stdout:1
2026-03-24T10:48:14.610
DEBUG:teuthology.orchestra.run.vm05:> sudo ceph --cluster ceph osd new fc499d35-830c-40c8-b3d2-d0fdaa9d8a65 2
2026-03-24T10:48:14.718 INFO:teuthology.orchestra.run.vm05.stdout:2
2026-03-24T10:48:14.731 INFO:tasks.ceph.osd.0:Restarting daemon
2026-03-24T10:48:14.732 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0
2026-03-24T10:48:14.732 INFO:tasks.ceph.osd.0:Started
2026-03-24T10:48:14.732 INFO:tasks.ceph.osd.1:Restarting daemon
2026-03-24T10:48:14.732 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 1
2026-03-24T10:48:14.733 INFO:tasks.ceph.osd.1:Started
2026-03-24T10:48:14.733 INFO:tasks.ceph.osd.2:Restarting daemon
2026-03-24T10:48:14.733 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 2
2026-03-24T10:48:14.734 INFO:tasks.ceph.osd.2:Started
2026-03-24T10:48:14.734 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd dump --format=json
2026-03-24T10:48:14.856 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:14.856
INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":5,"fsid":"a0c8ea99-5654-4097-bf98-f2a2e799bc82","created":"2026-03-24T10:48:14.032373+0000","modified":"2026-03-24T10:48:14.710778+0000","last_up_change":"0.000000","last_in_change":"2026-03-24T10:48:14.710778+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":2,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":0,"max_osd":3,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"tentacle","allow_crimson":false,"pools":[],"osds":[{"osd":0,"uuid":"3022b391-fa44-4bc1-b17f-aae1f26aed61","up":0,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":0,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[]},"cluster_addrs":{"addrvec":[]},"heartbeat_back_addrs":{"addrvec":[]},"heartbeat_front_addrs":{"addrvec":[]},"public_addr":"(unrecognized address family 0)/0","cluster_addr":"(unrecognized address family 0)/0","heartbeat_back_addr":"(unrecognized address family 0)/0","heartbeat_front_addr":"(unrecognized address family 0)/0","state":["exists","new"]},{"osd":1,"uuid":"25e05012-9142-46e5-8250-a717d8c70af9","up":0,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":0,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[]},"cluster_addrs":{"addrvec":[]},"heartbeat_back_addrs":{"addrvec":[]},"heartbeat_front_addrs":{"addrvec":[]},"public_addr":"(unrecognized address family 0)/0","cluster_addr":"(unrecognized address family 0)/0","heartbeat_back_addr":"(unrecognized address family 0)/0","heartbeat_front_addr":"(unrecognized address family 
0)/0","state":["exists","new"]},{"osd":2,"uuid":"fc499d35-830c-40c8-b3d2-d0fdaa9d8a65","up":0,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":0,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[]},"cluster_addrs":{"addrvec":[]},"heartbeat_back_addrs":{"addrvec":[]},"heartbeat_front_addrs":{"addrvec":[]},"public_addr":"(unrecognized address family 0)/0","cluster_addr":"(unrecognized address family 0)/0","heartbeat_back_addr":"(unrecognized address family 0)/0","heartbeat_front_addr":"(unrecognized address family 0)/0","state":["exists","new"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":0,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":0,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":0,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"isa","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-24T10:48:14.869 INFO:tasks.ceph.ceph_manager.ceph:[] 2026-03-24T10:48:14.869 INFO:tasks.ceph:Waiting for OSDs to come up 2026-03-24T10:48:15.032 INFO:tasks.ceph.osd.1.vm05.stderr:2026-03-24T10:48:15.027+0000 7f9e840bda40 -1 Falling back to public interface 2026-03-24T10:48:15.044 INFO:tasks.ceph.osd.0.vm05.stderr:2026-03-24T10:48:15.039+0000 7fc31d791a40 -1 Falling back to public interface 
2026-03-24T10:48:15.112 INFO:tasks.ceph.osd.2.vm05.stderr:2026-03-24T10:48:15.107+0000 7f5388966a40 -1 Falling back to public interface
2026-03-24T10:48:15.170 DEBUG:teuthology.orchestra.run.vm05:> adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph --cluster ceph osd dump --format=json
2026-03-24T10:48:15.326 INFO:tasks.ceph.mgr.x.vm05.stderr:/usr/lib/python3/dist-packages/scipy/__init__.py:67: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
2026-03-24T10:48:15.326 INFO:tasks.ceph.mgr.x.vm05.stderr:Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
2026-03-24T10:48:15.326 INFO:tasks.ceph.mgr.x.vm05.stderr: from numpy import show_config as show_numpy_config
2026-03-24T10:48:15.488 INFO:teuthology.misc.health.vm05.stdout:
2026-03-24T10:48:15.488
INFO:teuthology.misc.health.vm05.stdout:{"epoch":5,"fsid":"a0c8ea99-5654-4097-bf98-f2a2e799bc82","created":"2026-03-24T10:48:14.032373+0000","modified":"2026-03-24T10:48:14.710778+0000","last_up_change":"0.000000","last_in_change":"2026-03-24T10:48:14.710778+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":2,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":0,"max_osd":3,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"tentacle","allow_crimson":false,"pools":[],"osds":[{"osd":0,"uuid":"3022b391-fa44-4bc1-b17f-aae1f26aed61","up":0,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":0,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[]},"cluster_addrs":{"addrvec":[]},"heartbeat_back_addrs":{"addrvec":[]},"heartbeat_front_addrs":{"addrvec":[]},"public_addr":"(unrecognized address family 0)/0","cluster_addr":"(unrecognized address family 0)/0","heartbeat_back_addr":"(unrecognized address family 0)/0","heartbeat_front_addr":"(unrecognized address family 0)/0","state":["exists","new"]},{"osd":1,"uuid":"25e05012-9142-46e5-8250-a717d8c70af9","up":0,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":0,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[]},"cluster_addrs":{"addrvec":[]},"heartbeat_back_addrs":{"addrvec":[]},"heartbeat_front_addrs":{"addrvec":[]},"public_addr":"(unrecognized address family 0)/0","cluster_addr":"(unrecognized address family 0)/0","heartbeat_back_addr":"(unrecognized address family 0)/0","heartbeat_front_addr":"(unrecognized address family 
0)/0","state":["exists","new"]},{"osd":2,"uuid":"fc499d35-830c-40c8-b3d2-d0fdaa9d8a65","up":0,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":0,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[]},"cluster_addrs":{"addrvec":[]},"heartbeat_back_addrs":{"addrvec":[]},"heartbeat_front_addrs":{"addrvec":[]},"public_addr":"(unrecognized address family 0)/0","cluster_addr":"(unrecognized address family 0)/0","heartbeat_back_addr":"(unrecognized address family 0)/0","heartbeat_front_addr":"(unrecognized address family 0)/0","state":["exists","new"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":0,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":0,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":0,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"isa","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-24T10:48:15.500 DEBUG:teuthology.misc:0 of 3 OSDs are up 2026-03-24T10:48:15.606 INFO:tasks.ceph.osd.1.vm05.stderr:2026-03-24T10:48:15.599+0000 7f9e840bda40 -1 osd.1 0 log_to_monitors true 2026-03-24T10:48:15.686 INFO:tasks.ceph.osd.0.vm05.stderr:2026-03-24T10:48:15.679+0000 7fc31d791a40 -1 osd.0 0 log_to_monitors true 2026-03-24T10:48:15.777 
INFO:tasks.ceph.osd.2.vm05.stderr:2026-03-24T10:48:15.771+0000 7f5388966a40 -1 osd.2 0 log_to_monitors true
2026-03-24T10:48:15.946 INFO:tasks.ceph.mgr.x.vm05.stderr:Failed to import NVMeoFClient and related components: cannot import name 'NVMeoFClient' from 'dashboard.services.nvmeof_client' (/usr/share/ceph/mgr/dashboard/services/nvmeof_client.py)
2026-03-24T10:48:17.067 INFO:tasks.ceph.osd.2.vm05.stderr:2026-03-24T10:48:17.063+0000 7f538490f640 -1 osd.2 0 waiting for initial osdmap
2026-03-24T10:48:17.067 INFO:tasks.ceph.osd.0.vm05.stderr:2026-03-24T10:48:17.063+0000 7fc31973a640 -1 osd.0 0 waiting for initial osdmap
2026-03-24T10:48:17.068 INFO:tasks.ceph.osd.1.vm05.stderr:2026-03-24T10:48:17.063+0000 7f9e80066640 -1 osd.1 0 waiting for initial osdmap
2026-03-24T10:48:17.071 INFO:tasks.ceph.osd.0.vm05.stderr:2026-03-24T10:48:17.063+0000 7fc314548640 -1 osd.0 7 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-24T10:48:17.071 INFO:tasks.ceph.osd.2.vm05.stderr:2026-03-24T10:48:17.063+0000 7f537f71d640 -1 osd.2 7 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-24T10:48:17.072 INFO:tasks.ceph.osd.1.vm05.stderr:2026-03-24T10:48:17.067+0000 7f9e7ae74640 -1 osd.1 7 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-24T10:48:17.410 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T10:48:17.403+0000 7f1d22725640 -1 mgr.server handle_report got status from non-daemon mon.a
2026-03-24T10:48:21.802 DEBUG:teuthology.orchestra.run.vm05:> adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph --cluster ceph osd dump --format=json
2026-03-24T10:48:21.967 INFO:teuthology.misc.health.vm05.stdout:
2026-03-24T10:48:21.967
INFO:teuthology.misc.health.vm05.stdout:{"epoch":11,"fsid":"a0c8ea99-5654-4097-bf98-f2a2e799bc82","created":"2026-03-24T10:48:14.032373+0000","modified":"2026-03-24T10:48:21.410992+0000","last_up_change":"2026-03-24T10:48:18.049402+0000","last_in_change":"2026-03-24T10:48:14.710778+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":4,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":1,"max_osd":3,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"tentacle","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-24T10:48:18.414195+0000","flags":1,"flags_names":"hashpspool","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"11","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"no
ne"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_type":"Fair distribution","score_acting":2.9900000095367432,"score_stable":2.9900000095367432,"optimal_score":0.67000001668930054,"raw_score_acting":2,"raw_score_stable":2,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"3022b391-fa44-4bc1-b17f-aae1f26aed61","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6808","nonce":3270659984},{"type":"v1","addr":"192.168.123.105:6809","nonce":3270659984}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6810","nonce":3270659984},{"type":"v1","addr":"192.168.123.105:6811","nonce":3270659984}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6814","nonce":3270659984},{"type":"v1","addr":"192.168.123.105:6815","nonce":3270659984}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6812","nonce":3270659984},{"type":"v1","addr":"192.168.123.105:6813","nonce":3270659984}]},"public_addr":"192.168.123.105:6809/3270659984","cluster_addr":"192.168.123.105:6811/3270659984","heartbeat_back_addr":"192.168.123.105:6815/3270659984","heartbeat_front_addr":"192.168.123.105:6813/3270659984","state":["exists","up"]},{"osd":1,"uuid":"25e05012-9142-46e5-8250-a717d8c70af9","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":9,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6800","nonce":410
4923970},{"type":"v1","addr":"192.168.123.105:6801","nonce":4104923970}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6802","nonce":4104923970},{"type":"v1","addr":"192.168.123.105:6803","nonce":4104923970}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6806","nonce":4104923970},{"type":"v1","addr":"192.168.123.105:6807","nonce":4104923970}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6804","nonce":4104923970},{"type":"v1","addr":"192.168.123.105:6805","nonce":4104923970}]},"public_addr":"192.168.123.105:6801/4104923970","cluster_addr":"192.168.123.105:6803/4104923970","heartbeat_back_addr":"192.168.123.105:6807/4104923970","heartbeat_front_addr":"192.168.123.105:6805/4104923970","state":["exists","up"]},{"osd":2,"uuid":"fc499d35-830c-40c8-b3d2-d0fdaa9d8a65","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6816","nonce":951022638},{"type":"v1","addr":"192.168.123.105:6817","nonce":951022638}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6818","nonce":951022638},{"type":"v1","addr":"192.168.123.105:6819","nonce":951022638}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6822","nonce":951022638},{"type":"v1","addr":"192.168.123.105:6823","nonce":951022638}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6820","nonce":951022638},{"type":"v1","addr":"192.168.123.105:6821","nonce":951022638}]},"public_addr":"192.168.123.105:6817/951022638","cluster_addr":"192.168.123.105:6819/951022638","heartbeat_back_addr":"192.168.123.105:6823/951022638","heartbeat_front_addr":"192.168.123.105:6821/951022638","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub
":"0.000000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"isa","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-24T10:48:21.981 DEBUG:teuthology.misc:3 of 3 OSDs are up 2026-03-24T10:48:21.981 INFO:tasks.ceph:Creating RBD pool 2026-03-24T10:48:21.981 DEBUG:teuthology.orchestra.run.vm05:> sudo ceph --cluster ceph osd pool create rbd 8 2026-03-24T10:48:22.424 INFO:teuthology.orchestra.run.vm05.stderr:pool 'rbd' created 2026-03-24T10:48:22.444 DEBUG:teuthology.orchestra.run.vm05:> rbd --cluster ceph pool init rbd 2026-03-24T10:48:25.456 INFO:tasks.ceph:Starting mds daemons in cluster ceph... 
2026-03-24T10:48:25.457 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph config log 1 --format=json
2026-03-24T10:48:25.457 INFO:tasks.daemonwatchdog.daemon_watchdog:watchdog starting
2026-03-24T10:48:25.633 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:25.647 INFO:teuthology.orchestra.run.vm05.stdout:[{"version":1,"timestamp":"0.000000","name":"","changes":[]}]
2026-03-24T10:48:25.647 INFO:tasks.ceph_manager:config epoch is 1
2026-03-24T10:48:25.647 INFO:tasks.ceph:Waiting until ceph daemons up and pgs clean...
2026-03-24T10:48:25.647 INFO:tasks.ceph.ceph_manager.ceph:waiting for mgr available
2026-03-24T10:48:25.647 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph mgr dump --format=json
2026-03-24T10:48:25.847 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:25.861 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":5,"flags":0,"active_gid":4105,"active_name":"x","active_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6824","nonce":417429612},{"type":"v1","addr":"192.168.123.105:6825","nonce":417429612}]},"active_addr":"192.168.123.105:6825/417429612","active_change":"2026-03-24T10:48:16.395492+0000","active_mgr_features":4544132024016699391,"available":true,"standbys":[],"modules":["iostat","nfs"],"available_modules":[{"name":"alerts","can_run":true,"error_string":"","module_options":{"interval":{"name":"interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"How frequently to reexamine health
status","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"smtp_destination":{"name":"smtp_destination","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Email address to send alerts to, use commas to separate multiple","long_desc":"","tags":[],"see_also":[]},"smtp_from_name":{"name":"smtp_from_name","type":"str","level":"advanced","flags":1,"default_value":"Ceph","min":"","max":"","enum_allowed":[],"desc":"Email From: name","long_desc":"","tags":[],"see_also":[]},"smtp_host":{"name":"smtp_host","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_password":{"name":"smtp_password","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Password to authenticate with","long_desc":"","tags":[],"see_also":[]},"smtp_port":{"name":"smtp_port","type":"int","level":"advanced","flags":1,"default_value":"465","min":"","max":"","enum_allowed":[],"desc":"SMTP 
port","long_desc":"","tags":[],"see_also":[]},"smtp_sender":{"name":"smtp_sender","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP envelope sender","long_desc":"","tags":[],"see_also":[]},"smtp_ssl":{"name":"smtp_ssl","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Use SSL to connect to SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_user":{"name":"smtp_user","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"User to authenticate as","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"balancer","can_run":true,"error_string":"","module_options":{"active":{"name":"active","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"automatically balance PGs across cluster","long_desc":"","tags":[],"see_also":[]},"begin_time":{"name":"begin_time","type":"str","level":"advanced","flags":1,"default_value":"0000","min":"","max":"","enum_allowed":[],"desc":"beginning time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"begin_weekday":{"name":"begin_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to this day of the week or later","long_desc":"0 = Sunday, 1 = Monday, etc.","tags":[],"see_also":[]},"crush_compat_max_iterations":{"name":"crush_compat_max_iterations","type":"uint","level":"advanced","flags":1,"default_value":"25","min":"1","max":"250","enum_allowed":[],"desc":"maximum number of iterations to attempt 
optimization","long_desc":"","tags":[],"see_also":[]},"crush_compat_metrics":{"name":"crush_compat_metrics","type":"str","level":"advanced","flags":1,"default_value":"pgs,objects,bytes","min":"","max":"","enum_allowed":[],"desc":"metrics with which to calculate OSD utilization","long_desc":"Value is a list of one or more of \"pgs\", \"objects\", or \"bytes\", and indicates which metrics to use to balance utilization.","tags":[],"see_also":[]},"crush_compat_step":{"name":"crush_compat_step","type":"float","level":"advanced","flags":1,"default_value":"0.5","min":"0.001","max":"0.999","enum_allowed":[],"desc":"aggressiveness of optimization","long_desc":".99 is very aggressive, .01 is less aggressive","tags":[],"see_also":[]},"end_time":{"name":"end_time","type":"str","level":"advanced","flags":1,"default_value":"2359","min":"","max":"","enum_allowed":[],"desc":"ending time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"end_weekday":{"name":"end_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to days of the week earlier than this","long_desc":"0 = Sunday, 1 = Monday, 
etc.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_score":{"name":"min_score","type":"float","level":"advanced","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"minimum score, below which no optimization is attempted","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":1,"default_value":"upmap","min":"","max":"","enum_allowed":["crush-compat","none","read","upmap","upmap-read"],"desc":"Balancer mode","long_desc":"","tags":[],"see_also":[]},"pool_ids":{"name":"pool_ids","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"pools which the automatic balancing will be limited to","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and attempt 
optimization","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"update_pg_upmap_activity":{"name":"update_pg_upmap_activity","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Updates pg_upmap activity stats to be used in `balancer status detail`","long_desc":"","tags":[],"see_also":[]},"upmap_max_deviation":{"name":"upmap_max_deviation","type":"int","level":"advanced","flags":1,"default_value":"5","min":"1","max":"","enum_allowed":[],"desc":"deviation below which no optimization is attempted","long_desc":"If the number of PGs are within this count then no optimization is attempted","tags":[],"see_also":[]},"upmap_max_optimizations":{"name":"upmap_max_optimizations","type":"uint","level":"advanced","flags":1,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"maximum upmap optimizations to make per attempt","long_desc":"","tags":[],"see_also":[]}}},{"name":"cephadm","can_run":true,"error_string":"","module_options":{"agent_down_multiplier":{"name":"agent_down_multiplier","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"","max":"","enum_allowed":[],"desc":"Multiplied by agent refresh rate to calculate how long agent must not report before being marked down","long_desc":"","tags":[],"see_also":[]},"agent_refresh_rate":{"name":"agent_refresh_rate","type":"secs","level":"advanced","flags":0,"default_value":"20","min":"","max":"","enum_allowed":[],"desc":"How often agent on each host will try to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"agent_starting_port":{"name":"agent_starting_port","type":"int","level":"advanced","flags":0,"default_value":"4721","min":"","max":"","enum_allowed":[],"desc":"First port agent will try to bind to (will also try up to next 1000 subsequent 
ports if blocked)","long_desc":"","tags":[],"see_also":[]},"allow_ptrace":{"name":"allow_ptrace","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow SYS_PTRACE capability on ceph containers","long_desc":"The SYS_PTRACE capability is needed to attach to a process with gdb or strace. Enabling this options can allow debugging daemons that encounter problems at runtime.","tags":[],"see_also":[]},"autotune_interval":{"name":"autotune_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to autotune daemon memory","long_desc":"","tags":[],"see_also":[]},"autotune_memory_target_ratio":{"name":"autotune_memory_target_ratio","type":"float","level":"advanced","flags":0,"default_value":"0.7","min":"","max":"","enum_allowed":[],"desc":"ratio of total system memory to divide amongst autotuned daemons","long_desc":"","tags":[],"see_also":[]},"cephadm_log_destination":{"name":"cephadm_log_destination","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":["file","file,syslog","syslog"],"desc":"Destination for cephadm command's persistent logging","long_desc":"","tags":[],"see_also":[]},"certificate_automated_rotation_enabled":{"name":"certificate_automated_rotation_enabled","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"This flag controls whether cephadm automatically rotates certificates upon expiration.","long_desc":"","tags":[],"see_also":[]},"certificate_check_debug_mode":{"name":"certificate_check_debug_mode","type":"bool","level":"dev","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"FOR TESTING ONLY: This flag forces the certificate check instead of waiting for 
certificate_check_period.","long_desc":"","tags":[],"see_also":[]},"certificate_check_period":{"name":"certificate_check_period","type":"int","level":"advanced","flags":0,"default_value":"1","min":"0","max":"30","enum_allowed":[],"desc":"Specifies how often (in days) the certificate should be checked for validity.","long_desc":"","tags":[],"see_also":[]},"certificate_duration_days":{"name":"certificate_duration_days","type":"int","level":"advanced","flags":0,"default_value":"1095","min":"90","max":"3650","enum_allowed":[],"desc":"Specifies the duration of self certificates generated and signed by cephadm root CA","long_desc":"","tags":[],"see_also":[]},"certificate_renewal_threshold_days":{"name":"certificate_renewal_threshold_days","type":"int","level":"advanced","flags":0,"default_value":"30","min":"10","max":"90","enum_allowed":[],"desc":"Specifies the lead time in days to initiate certificate renewal before expiration.","long_desc":"","tags":[],"see_also":[]},"cgroups_split":{"name":"cgroups_split","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Pass --cgroups=split when cephadm creates containers (currently podman only)","long_desc":"","tags":[],"see_also":[]},"config_checks_enabled":{"name":"config_checks_enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable or disable the cephadm configuration analysis","long_desc":"","tags":[],"see_also":[]},"config_dashboard":{"name":"config_dashboard","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"manage configs like API endpoints in Dashboard.","long_desc":"","tags":[],"see_also":[]},"container_image_alertmanager":{"name":"container_image_alertmanager","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/alertmanager:v0.28.1","min":"","max":"","enum_allowed":[],"desc":"Alertmanager container 
image","long_desc":"","tags":[],"see_also":[]},"container_image_base":{"name":"container_image_base","type":"str","level":"advanced","flags":1,"default_value":"quay.io/ceph/ceph","min":"","max":"","enum_allowed":[],"desc":"Container image name, without the tag","long_desc":"","tags":[],"see_also":[]},"container_image_elasticsearch":{"name":"container_image_elasticsearch","type":"str","level":"advanced","flags":0,"default_value":"quay.io/omrizeneva/elasticsearch:6.8.23","min":"","max":"","enum_allowed":[],"desc":"Elasticsearch container image","long_desc":"","tags":[],"see_also":[]},"container_image_grafana":{"name":"container_image_grafana","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/grafana:12.3.1","min":"","max":"","enum_allowed":[],"desc":"Grafana container image","long_desc":"","tags":[],"see_also":[]},"container_image_haproxy":{"name":"container_image_haproxy","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/haproxy:2.3","min":"","max":"","enum_allowed":[],"desc":"Haproxy container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_agent":{"name":"container_image_jaeger_agent","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-agent:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger agent container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_collector":{"name":"container_image_jaeger_collector","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-collector:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger collector container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_query":{"name":"container_image_jaeger_query","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-query:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger query container 
image","long_desc":"","tags":[],"see_also":[]},"container_image_keepalived":{"name":"container_image_keepalived","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/keepalived:2.2.4","min":"","max":"","enum_allowed":[],"desc":"Keepalived container image","long_desc":"","tags":[],"see_also":[]},"container_image_loki":{"name":"container_image_loki","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/loki:3.0.0","min":"","max":"","enum_allowed":[],"desc":"Loki container image","long_desc":"","tags":[],"see_also":[]},"container_image_nginx":{"name":"container_image_nginx","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/nginx:sclorg-nginx-126","min":"","max":"","enum_allowed":[],"desc":"Nginx container image","long_desc":"","tags":[],"see_also":[]},"container_image_node_exporter":{"name":"container_image_node_exporter","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/node-exporter:v1.9.1","min":"","max":"","enum_allowed":[],"desc":"Node exporter container image","long_desc":"","tags":[],"see_also":[]},"container_image_nvmeof":{"name":"container_image_nvmeof","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/nvmeof:1.5","min":"","max":"","enum_allowed":[],"desc":"Nvmeof container image","long_desc":"","tags":[],"see_also":[]},"container_image_oauth2_proxy":{"name":"container_image_oauth2_proxy","type":"str","level":"advanced","flags":0,"default_value":"quay.io/oauth2-proxy/oauth2-proxy:v7.6.0","min":"","max":"","enum_allowed":[],"desc":"Oauth2 proxy container image","long_desc":"","tags":[],"see_also":[]},"container_image_prometheus":{"name":"container_image_prometheus","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/prometheus:v3.6.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container 
image","long_desc":"","tags":[],"see_also":[]},"container_image_promtail":{"name":"container_image_promtail","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/promtail:3.0.0","min":"","max":"","enum_allowed":[],"desc":"Promtail container image","long_desc":"","tags":[],"see_also":[]},"container_image_samba":{"name":"container_image_samba","type":"str","level":"advanced","flags":0,"default_value":"quay.io/samba.org/samba-server:ceph20-centos-amd64","min":"","max":"","enum_allowed":[],"desc":"Samba container image","long_desc":"","tags":[],"see_also":[]},"container_image_samba_metrics":{"name":"container_image_samba_metrics","type":"str","level":"advanced","flags":0,"default_value":"quay.io/samba.org/samba-metrics:ceph20-centos-amd64","min":"","max":"","enum_allowed":[],"desc":"Samba metrics container image","long_desc":"","tags":[],"see_also":[]},"container_image_snmp_gateway":{"name":"container_image_snmp_gateway","type":"str","level":"advanced","flags":0,"default_value":"docker.io/maxwo/snmp-notifier:v1.2.1","min":"","max":"","enum_allowed":[],"desc":"Snmp gateway container image","long_desc":"","tags":[],"see_also":[]},"container_init":{"name":"container_init","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Run podman/docker with `--init`","long_desc":"","tags":[],"see_also":[]},"daemon_cache_timeout":{"name":"daemon_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"seconds to cache service (daemon) inventory","long_desc":"","tags":[],"see_also":[]},"default_cephadm_command_timeout":{"name":"default_cephadm_command_timeout","type":"int","level":"advanced","flags":0,"default_value":"900","min":"","max":"","enum_allowed":[],"desc":"Default timeout applied to cephadm commands run directly on the host (in 
seconds)","long_desc":"","tags":[],"see_also":[]},"default_registry":{"name":"default_registry","type":"str","level":"advanced","flags":0,"default_value":"quay.io","min":"","max":"","enum_allowed":[],"desc":"Search-registry to which we should normalize unqualified image names. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"device_cache_timeout":{"name":"device_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"1800","min":"","max":"","enum_allowed":[],"desc":"seconds to cache device inventory","long_desc":"","tags":[],"see_also":[]},"device_enhanced_scan":{"name":"device_enhanced_scan","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use libstoragemgmt during device scans","long_desc":"","tags":[],"see_also":[]},"facts_cache_timeout":{"name":"facts_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"seconds to cache host facts data","long_desc":"","tags":[],"see_also":[]},"grafana_dashboards_path":{"name":"grafana_dashboards_path","type":"str","level":"advanced","flags":0,"default_value":"/etc/grafana/dashboards/ceph-dashboard/","min":"","max":"","enum_allowed":[],"desc":"location of dashboards to include in grafana deployments","long_desc":"","tags":[],"see_also":[]},"host_check_interval":{"name":"host_check_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to perform a host check","long_desc":"","tags":[],"see_also":[]},"hw_monitoring":{"name":"hw_monitoring","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Deploy hw monitoring daemon on every 
host.","long_desc":"","tags":[],"see_also":[]},"inventory_list_all":{"name":"inventory_list_all","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Whether ceph-volume inventory should report more devices (mostly mappers (LVs / mpaths), partitions...)","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_refresh_metadata":{"name":"log_refresh_metadata","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Log all refresh metadata. Includes daemon, device, and host info collected regularly. Only has effect if logging at debug level","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"log to the \"cephadm\" cluster log channel\"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf":{"name":"manage_etc_ceph_ceph_conf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Manage and own /etc/ceph/ceph.conf on the 
hosts.","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf_hosts":{"name":"manage_etc_ceph_ceph_conf_hosts","type":"str","level":"advanced","flags":0,"default_value":"*","min":"","max":"","enum_allowed":[],"desc":"PlacementSpec describing on which hosts to manage /etc/ceph/ceph.conf","long_desc":"","tags":[],"see_also":[]},"max_count_per_host":{"name":"max_count_per_host","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of daemons per service per host","long_desc":"","tags":[],"see_also":[]},"max_osd_draining_count":{"name":"max_osd_draining_count","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of osds that will be drained simultaneously when osds are removed","long_desc":"","tags":[],"see_also":[]},"migration_current":{"name":"migration_current","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"internal - do not modify","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":0,"default_value":"root","min":"","max":"","enum_allowed":["cephadm-package","root"],"desc":"mode for remote execution of cephadm","long_desc":"","tags":[],"see_also":[]},"oob_default_addr":{"name":"oob_default_addr","type":"str","level":"advanced","flags":0,"default_value":"169.254.1.1","min":"","max":"","enum_allowed":[],"desc":"Default address for RedFish API (oob management).","long_desc":"","tags":[],"see_also":[]},"prometheus_alerts_path":{"name":"prometheus_alerts_path","type":"str","level":"advanced","flags":0,"default_value":"/etc/prometheus/ceph/ceph_default_alerts.yml","min":"","max":"","enum_allowed":[],"desc":"location of alerts to include in prometheus 
deployments","long_desc":"","tags":[],"see_also":[]},"registry_insecure":{"name":"registry_insecure","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Registry is to be considered insecure (no TLS available). Only for development purposes.","long_desc":"","tags":[],"see_also":[]},"registry_password":{"name":"registry_password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository password. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"registry_url":{"name":"registry_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Registry url for login purposes. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"registry_username":{"name":"registry_username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository username. 
Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"secure_monitoring_stack":{"name":"secure_monitoring_stack","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable TLS security for all the monitoring stack daemons","long_desc":"","tags":[],"see_also":[]},"service_discovery_port":{"name":"service_discovery_port","type":"int","level":"advanced","flags":0,"default_value":"8765","min":"","max":"","enum_allowed":[],"desc":"cephadm service discovery port","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssh_config_file":{"name":"ssh_config_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"customized SSH config file to connect to managed hosts","long_desc":"","tags":[],"see_also":[]},"ssh_keepalive_count_max":{"name":"ssh_keepalive_count_max","type":"int","level":"advanced","flags":0,"default_value":"3","min":"","max":"","enum_allowed":[],"desc":"How many times ssh connections can fail liveness checks before the host is marked offline","long_desc":"","tags":[],"see_also":[]},"ssh_keepalive_interval":{"name":"ssh_keepalive_interval","type":"int","level":"advanced","flags":0,"default_value":"7","min":"","max":"","enum_allowed":[],"desc":"How often ssh connections are checked for liveness","long_desc":"","tags":[],"see_also":[]},"stray_daemon_check_interval":{"name":"stray_daemon_check_interval","type":"secs","level":"advanced","flags":0,"default_value":"1800","min":"","max":"","enum_allowed":[],"desc":"how frequently cephadm should check for the presence of stray 
daemons","long_desc":"","tags":[],"see_also":[]},"use_agent":{"name":"use_agent","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use cephadm agent on each host to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"use_repo_digest":{"name":"use_repo_digest","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Automatically convert image tags to image digest. Make sure all daemons use the same image","long_desc":"","tags":[],"see_also":[]},"warn_on_failed_host_check":{"name":"warn_on_failed_host_check","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if the host check fails","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_daemons":{"name":"warn_on_stray_daemons","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected that are not managed by cephadm","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_hosts":{"name":"warn_on_stray_hosts","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected on a host that is not managed by 
cephadm","long_desc":"","tags":[],"see_also":[]}}},{"name":"crash","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"retain_interval":{"name":"retain_interval","type":"secs","level":"advanced","flags":1,"default_value":"31536000","min":"","max":"","enum_allowed":[],"desc":"how long to retain crashes before pruning them","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"warn_recent_interval":{"name":"warn_recent_interval","type":"secs","level":"advanced","flags":1,"default_value":"1209600","min":"","max":"","enum_allowed":[],"desc":"time interval in which to warn about recent 
crashes","long_desc":"","tags":[],"see_also":[]}}},{"name":"dashboard","can_run":true,"error_string":"","module_options":{"ACCOUNT_LOCKOUT_ATTEMPTS":{"name":"ACCOUNT_LOCKOUT_ATTEMPTS","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_HOST":{"name":"ALERTMANAGER_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_SSL_VERIFY":{"name":"ALERTMANAGER_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_ENABLED":{"name":"AUDIT_API_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_LOG_PAYLOAD":{"name":"AUDIT_API_LOG_PAYLOAD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ENABLE_BROWSABLE_API":{"name":"ENABLE_BROWSABLE_API","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_CEPHFS":{"name":"FEATURE_TOGGLE_CEPHFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_DASHBOARD":{"name":"FEATURE_TOGGLE_DASHBOARD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_ISCSI":{"name":"FEATURE_TOGGLE_ISCSI","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"
FEATURE_TOGGLE_MIRRORING":{"name":"FEATURE_TOGGLE_MIRRORING","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_NFS":{"name":"FEATURE_TOGGLE_NFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RBD":{"name":"FEATURE_TOGGLE_RBD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RGW":{"name":"FEATURE_TOGGLE_RGW","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE":{"name":"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_PASSWORD":{"name":"GRAFANA_API_PASSWORD","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_SSL_VERIFY":{"name":"GRAFANA_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_URL":{"name":"GRAFANA_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_USERNAME":{"name":"GRAFANA_API_USERNAME","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_FRONTEND_API_URL":{"name":"GRAFANA_FRONTEND_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","
max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_UPDATE_DASHBOARDS":{"name":"GRAFANA_UPDATE_DASHBOARDS","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISCSI_API_SSL_VERIFICATION":{"name":"ISCSI_API_SSL_VERIFICATION","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISSUE_TRACKER_API_KEY":{"name":"ISSUE_TRACKER_API_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"MANAGED_BY_CLUSTERS":{"name":"MANAGED_BY_CLUSTERS","type":"str","level":"advanced","flags":0,"default_value":"[]","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"MULTICLUSTER_CONFIG":{"name":"MULTICLUSTER_CONFIG","type":"str","level":"advanced","flags":0,"default_value":"{}","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_HOST":{"name":"PROMETHEUS_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_SSL_VERIFY":{"name":"PROMETHEUS_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROM_ALERT_CREDENTIAL_CACHE_TTL":{"name":"PROM_ALERT_CREDENTIAL_CACHE_TTL","type":"int","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_COMPLEXITY_ENABLED":{"name":"PWD_POLICY_CHECK_COMPLEXITY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_PO
LICY_CHECK_EXCLUSION_LIST_ENABLED":{"name":"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_LENGTH_ENABLED":{"name":"PWD_POLICY_CHECK_LENGTH_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_OLDPWD_ENABLED":{"name":"PWD_POLICY_CHECK_OLDPWD_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_USERNAME_ENABLED":{"name":"PWD_POLICY_CHECK_USERNAME_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_ENABLED":{"name":"PWD_POLICY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_EXCLUSION_LIST":{"name":"PWD_POLICY_EXCLUSION_LIST","type":"str","level":"advanced","flags":0,"default_value":"osd,host,dashboard,pool,block,nfs,ceph,monitors,gateway,logs,crush,maps","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_COMPLEXITY":{"name":"PWD_POLICY_MIN_COMPLEXITY","type":"int","level":"advan
ced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_LENGTH":{"name":"PWD_POLICY_MIN_LENGTH","type":"int","level":"advanced","flags":0,"default_value":"8","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"REST_REQUESTS_TIMEOUT":{"name":"REST_REQUESTS_TIMEOUT","type":"int","level":"advanced","flags":0,"default_value":"45","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ACCESS_KEY":{"name":"RGW_API_ACCESS_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ADMIN_RESOURCE":{"name":"RGW_API_ADMIN_RESOURCE","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SECRET_KEY":{"name":"RGW_API_SECRET_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SSL_VERIFY":{"name":"RGW_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_HOSTNAME_PER_DAEMON":{"name":"RGW_HOSTNAME_PER_DAEMON","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"UNSAFE_TLS_v1_2":{"name":"UNSAFE_TLS_v1_2","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_SPAN":{"name":"USER_PWD_EXPIRATION_SPAN","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_1":{"name":"USER_PWD
_EXPIRATION_WARNING_1","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_2":{"name":"USER_PWD_EXPIRATION_WARNING_2","type":"int","level":"advanced","flags":0,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"cross_origin_url":{"name":"cross_origin_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"crt_file":{"name":"crt_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"crypto_caller":{"name":"crypto_caller","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"debug":{"name":"debug","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable/disable debug 
options","long_desc":"","tags":[],"see_also":[]},"jwt_token_ttl":{"name":"jwt_token_ttl","type":"int","level":"advanced","flags":0,"default_value":"28800","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"motd":{"name":"motd","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"The message of the 
day","long_desc":"","tags":[],"see_also":[]},"redirect_resolve_ip_addr":{"name":"redirect_resolve_ip_addr","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":0,"default_value":"8080","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl_server_port":{"name":"ssl_server_port","type":"int","level":"advanced","flags":0,"default_value":"8443","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sso_oauth2":{"name":"sso_oauth2","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":0,"default_value":"redirect","min":"","max":"","enum_allowed":["error","redirect"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":0,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url_prefix":{"name":"url_prefix","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}}
,{"name":"devicehealth","can_run":true,"error_string":"","module_options":{"enable_monitoring":{"name":"enable_monitoring","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"monitor device health metrics","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mark_out_threshold":{"name":"mark_out_threshold","type":"secs","level":"advanced","flags":1,"default_value":"2419200","min":"","max":"","enum_allowed":[],"desc":"automatically mark OSD if it may fail before this long","long_desc":"","tags":[],"see_also":[]},"pool_name":{"name":"pool_name","type":"str","level":"advanced","flags":1,"default_value":"device_health_metrics","min":"","max":"","enum_allowed":[],"desc":"name of pool in which to store device health metrics","long_desc":"","tags":[],"see_also":[]},"retention_period":{"name":"retention_period","type":"secs","level":"advanced","flags":1,"default_value":"15552000","min":"","max":"","enum_allowed":[],"desc":"how long to retain device health 
metrics","long_desc":"","tags":[],"see_also":[]},"scrape_frequency":{"name":"scrape_frequency","type":"secs","level":"advanced","flags":1,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"how frequently to scrape device health metrics","long_desc":"","tags":[],"see_also":[]},"self_heal":{"name":"self_heal","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"preemptively heal cluster around devices that may fail","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and check device health","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"warn_threshold":{"name":"warn_threshold","type":"secs","level":"advanced","flags":1,"default_value":"7257600","min":"","max":"","enum_allowed":[],"desc":"raise health warning if OSD may fail before this 
long","long_desc":"","tags":[],"see_also":[]}}},{"name":"diskprediction_local","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predict_interval":{"name":"predict_interval","type":"str","level":"advanced","flags":0,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predictor_model":{"name":"predictor_model","type":"str","level":"advanced","flags":0,"default_value":"prophetstor","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"str","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"influx","can_run":false,"error_string":"influxdb python module not 
found","module_options":{"batch_size":{"name":"batch_size","type":"int","level":"advanced","flags":0,"default_value":"5000","min":"","max":"","enum_allowed":[],"desc":"How big batches of data points should be when sending to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"database":{"name":"database","type":"str","level":"advanced","flags":0,"default_value":"ceph","min":"","max":"","enum_allowed":[],"desc":"InfluxDB database name. You will need to create this database and grant write privileges to the configured username or the username must have admin privileges to create it.","long_desc":"","tags":[],"see_also":[]},"hostname":{"name":"hostname","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server hostname","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"30","min":"5","max":"","enum_allowed":[],"desc":"Time between reports to InfluxDB. 
Default 30 seconds.","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"password":{"name":"password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"password of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"port":{"name":"port","type":"int","level":"advanced","flags":0,"default_value":"8086","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server port","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"str","level":"advanced","flags":0,"default_value":"false","min":"","max":"","enum_allowed":[],"desc":"Use https connection for InfluxDB server. 
Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]},"threads":{"name":"threads","type":"int","level":"advanced","flags":0,"default_value":"5","min":"1","max":"32","enum_allowed":[],"desc":"How many worker threads should be spawned for sending data to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"username":{"name":"username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"username of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"verify_ssl":{"name":"verify_ssl","type":"str","level":"advanced","flags":0,"default_value":"true","min":"","max":"","enum_allowed":[],"desc":"Verify https cert for InfluxDB server. Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]}}},{"name":"insights","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"iostat","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level
","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"localpool","can_run":true,"error_string":"","module_options":{"failure_domain":{"name":"failure_domain","type":"str","level":"advanced","flags":1,"default_value":"host","min":"","max":"","enum_allowed":[],"desc":"failure domain for any created local pool","long_desc":"what failure domain we should separate data replicas 
across.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_size":{"name":"min_size","type":"int","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"default min_size for any created local pool","long_desc":"value to set min_size to (unchanged from Ceph's default if this option is not set)","tags":[],"see_also":[]},"num_rep":{"name":"num_rep","type":"int","level":"advanced","flags":1,"default_value":"3","min":"","max":"","enum_allowed":[],"desc":"default replica count for any created local pool","long_desc":"","tags":[],"see_also":[]},"pg_num":{"name":"pg_num","type":"int","level":"advanced","flags":1,"default_value":"128","min":"","max":"","enum_allowed":[],"desc":"default pg_num for any created local pool","long_desc":"","tags":[],"see_also":[]},"prefix":{"name":"prefix","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"name prefix for any created local 
pool","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"subtree":{"name":"subtree","type":"str","level":"advanced","flags":1,"default_value":"rack","min":"","max":"","enum_allowed":[],"desc":"CRUSH level for which to create a local pool","long_desc":"which CRUSH subtree type the module should create a pool for.","tags":[],"see_also":[]}}},{"name":"mirroring","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"nfs","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level"
:"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"orchestrator","can_run":true,"error_string":"","module_options":{"fail_fs":{"name":"fail_fs","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Fail filesystem for rapid multi-rank mds 
upgrade","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"orchestrator":{"name":"orchestrator","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["cephadm","rook","test_orchestrator"],"desc":"Orchestrator 
backend","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_perf_query","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":""
,"enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"pg_autoscaler","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"threshold":{"name":"threshold","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"1.0","max":"","enum_allowed":[],"desc":"scaling threshold","long_desc":"The 
factor by which the `NEW PG_NUM` must vary from the current`PG_NUM` before being accepted. Cannot be less than 1.0","tags":[],"see_also":[]}}},{"name":"progress","can_run":true,"error_string":"","module_options":{"allow_pg_recovery_event":{"name":"allow_pg_recovery_event","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow the module to show pg recovery progress","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_completed_events":{"name":"max_completed_events","type":"int","level":"advanced","flags":1,"default_value":"50","min":"","max":"","enum_allowed":[],"desc":"number of past completed events to remember","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"how long the module is going to 
sleep","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"prometheus","can_run":true,"error_string":"","module_options":{"cache":{"name":"cache","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"exclude_perf_counters":{"name":"exclude_perf_counters","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Do not include perf-counters in the metrics output","long_desc":"Gathering perf-counters from a single Prometheus exporter can degrade ceph-mgr performance, especially in large clusters. Instead, Ceph-exporter daemons are now used by default for perf-counter gathering. This should only be disabled when no ceph-exporters are deployed.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools":{"name":"rbd_stats_pools","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","
enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools_refresh_interval":{"name":"rbd_stats_pools_refresh_interval","type":"int","level":"advanced","flags":0,"default_value":"300","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"scrape_interval":{"name":"scrape_interval","type":"float","level":"advanced","flags":0,"default_value":"15.0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"the IPv4 or IPv6 address on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":1,"default_value":"9283","min":"","max":"","enum_allowed":[],"desc":"the port on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"stale_cache_strategy":{"name":"stale_cache_strategy","type":"str","level":"advanced","flags":0,"default_value":"log","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":1,"default_value":"default","min":"","max":"","enum_allowed":["default","error"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":1,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rbd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","
max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_snap_create":{"name":"max_concurrent_snap_create","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mirror_snapshot_schedule":{"name":"mirror_snapshot_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"trash_purge_schedule":{"name":"trash_purge_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rgw","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_al
lowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"secondary_zone_period_retry_limit":{"name":"secondary_zone_period_retry_limit","type":"int","level":"advanced","flags":0,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"RGW module period update retry limit for secondary site","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"selftest","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption1":{"name":"roption1","type":"str","level":"advanced","flags":0,"default_value":"","mi
n":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption2":{"name":"roption2","type":"str","level":"advanced","flags":0,"default_value":"xyz","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption1":{"name":"rwoption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption2":{"name":"rwoption2","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption3":{"name":"rwoption3","type":"float","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption4":{"name":"rwoption4","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption5":{"name":"rwoption5","type":"bool","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption6":{"name":"rwoption6","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption7":{"name":"rwoption7","type":"int","level":"advanced","flags":0,"default_value":"","min":"1","max":"42","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testkey":{"name":"testkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testlkey":{"name":"testlkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"
","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testnewline":{"name":"testnewline","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"snap_schedule","can_run":true,"error_string":"","module_options":{"allow_m_granularity":{"name":"allow_m_granularity","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow minute scheduled snapshots","long_desc":"","tags":[],"see_also":[]},"dump_on_update":{"name":"dump_on_update","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"dump database to debug log on update","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"stats","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default
_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"status","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":
"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telegraf","can_run":true,"error_string":"","module_options":{"address":{"name":"address","type":"str","level":"advanced","flags":0,"default_value":"unixgram:///tmp/telegraf.sock","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"15","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telemetry","can_run":true,"error_string":"","module_options":{"channel_basic":{"name":"channel_basic","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share basic cluster information (size, 
version)","long_desc":"","tags":[],"see_also":[]},"channel_crash":{"name":"channel_crash","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share metadata about Ceph daemon crashes (version, stack straces, etc)","long_desc":"","tags":[],"see_also":[]},"channel_device":{"name":"channel_device","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share device health metrics (e.g., SMART data, minus potentially identifying info like serial numbers)","long_desc":"","tags":[],"see_also":[]},"channel_ident":{"name":"channel_ident","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share a user-provided description and/or contact email for the cluster","long_desc":"","tags":[],"see_also":[]},"channel_perf":{"name":"channel_perf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share various performance metrics of a 
cluster","long_desc":"","tags":[],"see_also":[]},"contact":{"name":"contact","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"description":{"name":"description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"device_url":{"name":"device_url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/device","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"int","level":"advanced","flags":0,"default_value":"24","min":"8","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"last_opt_revision":{"name":"last_opt_revision","type":"int","level":"advanced","flags":0,"default_value":"1","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard":{"name":"leaderboard","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard_description":{"name":"leaderboard_description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_t
o_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"organization":{"name":"organization","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"proxy":{"name":"proxy","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url":{"name":"url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/report","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"test_orchestrator","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced
","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"volumes","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_clones":{"name":"max_concurrent_clones","type":"int","level":"advanced","flags":0,"default_value":"4","min":"","max":"","enum_allowed":[],"desc":"Number of asynchronous cloner threads","long_desc":"","tags":[],"see_also":[]},"pause_cloning":{"name":"pause_cloning","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Pause asynchronous cloner threads","long_desc":"","tags":[],"see_also":[]},"pause_purging":{"name":"pause_purging","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Pause asynchronous subvolume purge 
threads","long_desc":"","tags":[],"see_also":[]},"periodic_async_work":{"name":"periodic_async_work","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Periodically check for async work","long_desc":"","tags":[],"see_also":[]},"snapshot_clone_delay":{"name":"snapshot_clone_delay","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"Delay clone begin operation by snapshot_clone_delay seconds","long_desc":"","tags":[],"see_also":[]},"snapshot_clone_no_wait":{"name":"snapshot_clone_no_wait","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Reject subvolume clone request when cloner threads are busy","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}}],"services":{},"always_on_modules":{"octopus":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"pacific":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"quincy":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"reef":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"squid":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"tentacle":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"]},"force_disabled_modules":{},"last_failure_osd_epoch":0,"active_clients":[{"name":"devicehealth","addrvec":[{"type":"v2","addr":"192.168.123.105:0","nonce":28692
54057}]},{"name":"libcephsqlite","addrvec":[{"type":"v2","addr":"192.168.123.105:0","nonce":2624520912}]},{"name":"rbd_support","addrvec":[{"type":"v2","addr":"192.168.123.105:0","nonce":4260664610}]},{"name":"volumes","addrvec":[{"type":"v2","addr":"192.168.123.105:0","nonce":3895405096}]}]} 2026-03-24T10:48:25.862 INFO:tasks.ceph.ceph_manager.ceph:mgr available! 2026-03-24T10:48:25.862 INFO:tasks.ceph.ceph_manager.ceph:waiting for all up 2026-03-24T10:48:25.862 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd dump --format=json 2026-03-24T10:48:26.030 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-24T10:48:26.030 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":15,"fsid":"a0c8ea99-5654-4097-bf98-f2a2e799bc82","created":"2026-03-24T10:48:14.032373+0000","modified":"2026-03-24T10:48:25.428651+0000","last_up_change":"2026-03-24T10:48:18.049402+0000","last_in_change":"2026-03-24T10:48:14.710778+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":4,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":2,"max_osd":3,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"tentacle","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-24T10:48:18.414195+0000","flags":1,"flags_names":"hashpspool","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgi
d":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"11","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_type":"Fair 
distribution","score_acting":2.9900000095367432,"score_stable":2.9900000095367432,"optimal_score":0.67000001668930054,"raw_score_acting":2,"raw_score_stable":2,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}},{"pool":2,"pool_name":"rbd","create_time":"2026-03-24T10:48:22.140928+0000","flags":8193,"flags_names":"hashpspool,selfmanaged_snaps","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":8,"pg_placement_num":8,"pg_placement_num_target":8,"pg_num_target":8,"pg_num_pending":8,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"15","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":2,"snap_epoch":15,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{},"application_metadata":{"rbd":{}},"read_balance":{"score_type":"Fair 
distribution","score_acting":1.8799999952316284,"score_stable":1.8799999952316284,"optimal_score":1,"raw_score_acting":1.8799999952316284,"raw_score_stable":1.8799999952316284,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"3022b391-fa44-4bc1-b17f-aae1f26aed61","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":12,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6808","nonce":3270659984},{"type":"v1","addr":"192.168.123.105:6809","nonce":3270659984}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6810","nonce":3270659984},{"type":"v1","addr":"192.168.123.105:6811","nonce":3270659984}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6814","nonce":3270659984},{"type":"v1","addr":"192.168.123.105:6815","nonce":3270659984}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6812","nonce":3270659984},{"type":"v1","addr":"192.168.123.105:6813","nonce":3270659984}]},"public_addr":"192.168.123.105:6809/3270659984","cluster_addr":"192.168.123.105:6811/3270659984","heartbeat_back_addr":"192.168.123.105:6815/3270659984","heartbeat_front_addr":"192.168.123.105:6813/3270659984","state":["exists","up"]},{"osd":1,"uuid":"25e05012-9142-46e5-8250-a717d8c70af9","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":12,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6800","nonce":4104923970},{"type":"v1","addr":"192.168.123.105:6801","nonce":4104923970}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6802","nonce":4104923970},{"type":"v1","addr":"192.168.123.105:6803","nonce":4104923970}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6806","nonce":4104923970},{"type":"v1","addr":"192.168.123.105:6807","nonce":410
4923970}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6804","nonce":4104923970},{"type":"v1","addr":"192.168.123.105:6805","nonce":4104923970}]},"public_addr":"192.168.123.105:6801/4104923970","cluster_addr":"192.168.123.105:6803/4104923970","heartbeat_back_addr":"192.168.123.105:6807/4104923970","heartbeat_front_addr":"192.168.123.105:6805/4104923970","state":["exists","up"]},{"osd":2,"uuid":"fc499d35-830c-40c8-b3d2-d0fdaa9d8a65","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":12,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6816","nonce":951022638},{"type":"v1","addr":"192.168.123.105:6817","nonce":951022638}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6818","nonce":951022638},{"type":"v1","addr":"192.168.123.105:6819","nonce":951022638}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6822","nonce":951022638},{"type":"v1","addr":"192.168.123.105:6823","nonce":951022638}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6820","nonce":951022638},{"type":"v1","addr":"192.168.123.105:6821","nonce":951022638}]},"public_addr":"192.168.123.105:6817/951022638","cluster_addr":"192.168.123.105:6819/951022638","heartbeat_back_addr":"192.168.123.105:6823/951022638","heartbeat_front_addr":"192.168.123.105:6821/951022638","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0}],"pg_u
pmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"isa","technique":"reed_sol_van"}},"removed_snaps_queue":[{"pool":2,"snaps":[{"begin":2,"length":1}]}],"new_removed_snaps":[{"pool":2,"snaps":[{"begin":2,"length":1}]}],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-24T10:48:26.044 INFO:tasks.ceph.ceph_manager.ceph:all up! 2026-03-24T10:48:26.044 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd dump --format=json 2026-03-24T10:48:26.214 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-24T10:48:26.214 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":15,"fsid":"a0c8ea99-5654-4097-bf98-f2a2e799bc82","created":"2026-03-24T10:48:14.032373+0000","modified":"2026-03-24T10:48:25.428651+0000","last_up_change":"2026-03-24T10:48:18.049402+0000","last_in_change":"2026-03-24T10:48:14.710778+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":4,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":2,"max_osd":3,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"tentacle","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-24T10:48:18.414195+0000","flags":1,"flags_names":"hashpspool","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandat
ory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"11","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_type":"Fair 
distribution","score_acting":2.9900000095367432,"score_stable":2.9900000095367432,"optimal_score":0.67000001668930054,"raw_score_acting":2,"raw_score_stable":2,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}},{"pool":2,"pool_name":"rbd","create_time":"2026-03-24T10:48:22.140928+0000","flags":8193,"flags_names":"hashpspool,selfmanaged_snaps","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":8,"pg_placement_num":8,"pg_placement_num_target":8,"pg_num_target":8,"pg_num_pending":8,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"15","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":2,"snap_epoch":15,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{},"application_metadata":{"rbd":{}},"read_balance":{"score_type":"Fair 
distribution","score_acting":1.8799999952316284,"score_stable":1.8799999952316284,"optimal_score":1,"raw_score_acting":1.8799999952316284,"raw_score_stable":1.8799999952316284,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"3022b391-fa44-4bc1-b17f-aae1f26aed61","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":12,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6808","nonce":3270659984},{"type":"v1","addr":"192.168.123.105:6809","nonce":3270659984}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6810","nonce":3270659984},{"type":"v1","addr":"192.168.123.105:6811","nonce":3270659984}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6814","nonce":3270659984},{"type":"v1","addr":"192.168.123.105:6815","nonce":3270659984}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6812","nonce":3270659984},{"type":"v1","addr":"192.168.123.105:6813","nonce":3270659984}]},"public_addr":"192.168.123.105:6809/3270659984","cluster_addr":"192.168.123.105:6811/3270659984","heartbeat_back_addr":"192.168.123.105:6815/3270659984","heartbeat_front_addr":"192.168.123.105:6813/3270659984","state":["exists","up"]},{"osd":1,"uuid":"25e05012-9142-46e5-8250-a717d8c70af9","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":12,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6800","nonce":4104923970},{"type":"v1","addr":"192.168.123.105:6801","nonce":4104923970}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6802","nonce":4104923970},{"type":"v1","addr":"192.168.123.105:6803","nonce":4104923970}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6806","nonce":4104923970},{"type":"v1","addr":"192.168.123.105:6807","nonce":410
4923970}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6804","nonce":4104923970},{"type":"v1","addr":"192.168.123.105:6805","nonce":4104923970}]},"public_addr":"192.168.123.105:6801/4104923970","cluster_addr":"192.168.123.105:6803/4104923970","heartbeat_back_addr":"192.168.123.105:6807/4104923970","heartbeat_front_addr":"192.168.123.105:6805/4104923970","state":["exists","up"]},{"osd":2,"uuid":"fc499d35-830c-40c8-b3d2-d0fdaa9d8a65","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":12,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6816","nonce":951022638},{"type":"v1","addr":"192.168.123.105:6817","nonce":951022638}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6818","nonce":951022638},{"type":"v1","addr":"192.168.123.105:6819","nonce":951022638}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6822","nonce":951022638},{"type":"v1","addr":"192.168.123.105:6823","nonce":951022638}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6820","nonce":951022638},{"type":"v1","addr":"192.168.123.105:6821","nonce":951022638}]},"public_addr":"192.168.123.105:6817/951022638","cluster_addr":"192.168.123.105:6819/951022638","heartbeat_back_addr":"192.168.123.105:6823/951022638","heartbeat_front_addr":"192.168.123.105:6821/951022638","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0}],"pg_u
pmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"isa","technique":"reed_sol_van"}},"removed_snaps_queue":[{"pool":2,"snaps":[{"begin":2,"length":1}]}],"new_removed_snaps":[{"pool":2,"snaps":[{"begin":2,"length":1}]}],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-24T10:48:26.229 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph tell osd.0 flush_pg_stats 2026-03-24T10:48:26.229 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph tell osd.1 flush_pg_stats 2026-03-24T10:48:26.229 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph tell osd.2 flush_pg_stats 2026-03-24T10:48:26.326 INFO:teuthology.orchestra.run.vm05.stdout:34359738371 2026-03-24T10:48:26.327 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd last-stat-seq osd.0 2026-03-24T10:48:26.335 INFO:teuthology.orchestra.run.vm05.stdout:34359738371 2026-03-24T10:48:26.335 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd last-stat-seq osd.1 2026-03-24T10:48:26.340 INFO:teuthology.orchestra.run.vm05.stdout:34359738371 2026-03-24T10:48:26.340 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd last-stat-seq osd.2 
2026-03-24T10:48:26.527 INFO:teuthology.orchestra.run.vm05.stdout:34359738371 2026-03-24T10:48:26.542 INFO:tasks.ceph.ceph_manager.ceph:need seq 34359738371 got 34359738371 for osd.0 2026-03-24T10:48:26.542 DEBUG:teuthology.parallel:result is None 2026-03-24T10:48:26.561 INFO:teuthology.orchestra.run.vm05.stdout:34359738371 2026-03-24T10:48:26.575 INFO:tasks.ceph.ceph_manager.ceph:need seq 34359738371 got 34359738371 for osd.1 2026-03-24T10:48:26.575 DEBUG:teuthology.parallel:result is None 2026-03-24T10:48:26.580 INFO:teuthology.orchestra.run.vm05.stdout:34359738371 2026-03-24T10:48:26.593 INFO:tasks.ceph.ceph_manager.ceph:need seq 34359738371 got 34359738371 for osd.2 2026-03-24T10:48:26.593 DEBUG:teuthology.parallel:result is None 2026-03-24T10:48:26.593 INFO:tasks.ceph.ceph_manager.ceph:waiting for clean 2026-03-24T10:48:26.593 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json 2026-03-24T10:48:26.802 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-24T10:48:26.802 INFO:teuthology.orchestra.run.vm05.stderr:dumped all 2026-03-24T10:48:26.817 
INFO:teuthology.orchestra.run.vm05.stdout:{"pg_ready":true,"pg_map":{"version":16,"stamp":"2026-03-24T10:48:26.402582+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459299,"num_objects":4,"num_object_clones":0,"num_object_copies":8,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":4,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":59,"num_write_kb":586,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":35,"ondisk_log_size":35,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":15,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":81392,"kb_used_data":856,"kb_used_omap":19,"kb_used_meta":80428,"kb_avail":283034128,"statfs":{"total":289910292480,"available":289826947072,"internally_reserved":0,"allocated":876544,"data_stored":1029399,"data_compressed":8208,"data_compressed_allocated":442368,"data_compressed_original":884736,"omap_allocated":20462,"internal_metadata":82358290},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_sta
t":{"commit_latency_ms":1,"apply_latency_ms":1,"commit_latency_ns":1000000,"apply_latency_ns":1000000},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"2.977300"},"pg_stats":[{"pgid":"2.7","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T10:48:25.443235+0000","last_change":"2026-03-24T10:48:25.443561+0000","last_active":"2026-03-24T10:48:25.443235+0000","last_peered":"2026-03-24T10:48:25.443235+0000","last_clean":"2026-03-24T10:48:25.443235+0000","last_became_active":"2026-03-24T10:48:23.431584+0000","last_became_peered":"2026-03-24T10:48:23.431584+0000","last_unstale":"2026-03-24T10:48:25.443235+0000","last_undegraded":"2026-03-24T10:48:25.443235+0000","last_fullsized":"2026-03-24T10:48:25.443235+0000","mapping_epoch":12,"log_start":"0'0","on
disk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T18:10:48.363404+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00089274000000000005,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.6","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T10:48:25.442830+0000","last_change":"2026-03-24
T10:48:25.442965+0000","last_active":"2026-03-24T10:48:25.442830+0000","last_peered":"2026-03-24T10:48:25.442830+0000","last_clean":"2026-03-24T10:48:25.442830+0000","last_became_active":"2026-03-24T10:48:23.430333+0000","last_became_peered":"2026-03-24T10:48:23.430333+0000","last_unstale":"2026-03-24T10:48:25.442830+0000","last_undegraded":"2026-03-24T10:48:25.442830+0000","last_fullsized":"2026-03-24T10:48:25.442830+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T16:58:50.863143+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00040477699999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.5","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T10:48:25.443407+0000","last_change":"2026-03-24T10:48:25.443496+0000","last_active":"2026-03-24T10:48:25.443407+0000","last_peered":"2026-03-24T10:48:25.443407+0000","last_clean":"2026-03-24T10:48:25.443407+0000","last_became_active":"2026-03-24T10:48:23.431659+0000","last_became_peered":"2026-03-24T10:48:23.431659+0000","last_unstale":"2026-03-24T10:48:25.443407+0000","last_undegraded":"2026-03-24T10:48:25.443407+0000","last_fullsized":"2026-03-24T10:48:25.443407+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_
deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T18:52:51.949355+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00077010000000000002,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.4","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T10:48:25.442793+0000","last_change":"2026-03-24T10:48:25.442963+0000","last_active":"2026-03-24T10:48:25.442793+0000","last_peered":"2026-03-24T10:48:25.442793+0000","last_clean":"2026-03-24T10:48:25.442793+0000","last_became_active":"2026-03-
24T10:48:23.430350+0000","last_became_peered":"2026-03-24T10:48:23.430350+0000","last_unstale":"2026-03-24T10:48:25.442793+0000","last_undegraded":"2026-03-24T10:48:25.442793+0000","last_fullsized":"2026-03-24T10:48:25.442793+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T20:56:41.890812+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00041283200000000002,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_locati
on_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.2","version":"15'2","reported_seq":22,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T10:48:25.448020+0000","last_change":"2026-03-24T10:48:25.448020+0000","last_active":"2026-03-24T10:48:25.448020+0000","last_peered":"2026-03-24T10:48:25.448020+0000","last_clean":"2026-03-24T10:48:25.448020+0000","last_became_active":"2026-03-24T10:48:23.429577+0000","last_became_peered":"2026-03-24T10:48:23.429577+0000","last_unstale":"2026-03-24T10:48:25.448020+0000","last_undegraded":"2026-03-24T10:48:25.448020+0000","last_fullsized":"2026-03-24T10:48:25.448020+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":2,"log_dups_size":0,"ondisk_log_size":2,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T15:28:31.147114+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00058253899999999997,"stat_sum":{"num_bytes":19,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.1","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T10:48:25.701106+0000","last_change":"2026-03-24T10:48:25.701236+0000","last_active":"2026-03-24T10:48:25.701106+0000","last_peered":"2026-03-24T10:48:25.701106+0000","last_clean":"2026-03-24T10:48:25.701106+0000","last_became_active":"2026-03-24T10:48:23.429702+0000","last_became_peered":"2026-03-24T10:48:23.429702+0000","last_unstale":"2026-03-24T10:48:25.701106+0000","last_undegraded":"2026-03-24T10:48:25.701106+0000","last_fullsized":"2026-03-24T10:48:25.701106+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last
_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T22:47:06.640036+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00024831499999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.0","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T10:48:25.701266+0000","last_change":"2026-03-24T10:48:25.701388+0000","last_active":"2026-03-24T10:48:25.701266+0000","last_peered":"2026-03-24T10:48:25.701266+0000","last_clean":"2026-03-24T10:48:25.701266+0000","last_became_active":"2026-03
-24T10:48:23.429898+0000","last_became_peered":"2026-03-24T10:48:23.429898+0000","last_unstale":"2026-03-24T10:48:25.701266+0000","last_undegraded":"2026-03-24T10:48:25.701266+0000","last_fullsized":"2026-03-24T10:48:25.701266+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T13:05:31.194855+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.000308537,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts"
:[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.3","version":"13'1","reported_seq":21,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T10:48:25.443005+0000","last_change":"2026-03-24T10:48:25.443081+0000","last_active":"2026-03-24T10:48:25.443005+0000","last_peered":"2026-03-24T10:48:25.443005+0000","last_clean":"2026-03-24T10:48:25.443005+0000","last_became_active":"2026-03-24T10:48:23.429112+0000","last_became_peered":"2026-03-24T10:48:23.429112+0000","last_unstale":"2026-03-24T10:48:25.443005+0000","last_undegraded":"2026-03-24T10:48:25.443005+0000","last_fullsized":"2026-03-24T10:48:25.443005+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":1,"log_dups_size":0,"ondisk_log_size":1,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T16:07:42.895533+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.000468486,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"1.0","version":"10'32","reported_seq":65,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T10:48:25.442748+0000","last_change":"2026-03-24T10:48:20.416109+0000","last_active":"2026-03-24T10:48:25.442748+0000","last_peered":"2026-03-24T10:48:25.442748+0000","last_clean":"2026-03-24T10:48:25.442748+0000","last_became_active":"2026-03-24T10:48:20.415531+0000","last_became_peered":"2026-03-24T10:48:20.415531+0000","last_unstale":"2026-03-24T10:48:25.442748+0000","last_undegraded":"2026-03-24T10:48:25.442748+0000","last_fullsized":"2026-03-24T10:48:25.442748+0000","mapping_epoch":9,"log_start":"0'0","ondisk_log_start":"0'0","created":9,"last_epoch_clean":10,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:19.407878+0000","last_deep_scrub":"0'0","last_deep_scrub_
stamp":"2026-03-24T10:48:19.407878+0000","last_clean_scrub_stamp":"2026-03-24T10:48:19.407878+0000","objects_scrubbed":0,"log_size":32,"log_dups_size":0,"ondisk_log_size":32,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T12:59:59.945556+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}],"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors
":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":38,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":3,"ondisk_log_size":3,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":483328,"data_stored":918560,"data_compressed":8208,"data_compressed_allocated":442368,"data_compressed_original":884736,"omap_allocated":0,"internal_metadata":0},"log_size":32,"ondisk_log_size
":32,"up":2,"acting":2,"num_store_stats":2}],"osd_stats":[{"osd":2,"up_from":8,"seq":34359738371,"num_pgs":3,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":26976,"kb_used_data":120,"kb_used_omap":7,"kb_used_meta":26808,"kb_avail":94344864,"statfs":{"total":96636764160,"available":96609140736,"internally_reserved":0,"allocated":122880,"data_stored":34563,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":7470,"internal_metadata":27452114},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]},{"osd":1,"up_from":8,"seq":34359738371,"num_pgs":6,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":27208,"kb_used_data":368,"kb_used_omap":8,"kb_used_meta":26807,"kb_avail":94344632,"statfs":{"total":96636764160,"available":96608903168,"internally_reserved":0,"allocated":376832,"data_stored":497418,"data_compressed":4104,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":8771,"internal_metadata":27450813},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]},{"osd":0,"up_from":8,"seq":34359738371,"num_pgs":6,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":27208,"kb_used_data":368,"kb_used_omap":4,"kb_used_meta":26811,"kb_avail":94344632,"statfs":{"total":96636764160,"available":96608903168,"internally_reserved":0,"allocated":376832,"data_stored":497418,"data_compressed":4104,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":4221,"internal_metadata":27455363},"hb_
peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":1,"apply_latency_ms":1,"commit_latency_ns":1000000,"apply_latency_ns":1000000},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":4104,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":4104,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0}]}} 2026-03-24T10:48:26.817 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json 2026-03-24T10:48:26.987 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-24T10:48:26.988 INFO:teuthology.orchestra.run.vm05.stderr:dumped all 2026-03-24T10:48:27.001 
INFO:teuthology.orchestra.run.vm05.stdout:{"pg_ready":true,"pg_map":{"version":16,"stamp":"2026-03-24T10:48:26.402582+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459299,"num_objects":4,"num_object_clones":0,"num_object_copies":8,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":4,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":59,"num_write_kb":586,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":35,"ondisk_log_size":35,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":15,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":81392,"kb_used_data":856,"kb_used_omap":19,"kb_used_meta":80428,"kb_avail":283034128,"statfs":{"total":289910292480,"available":289826947072,"internally_reserved":0,"allocated":876544,"data_stored":1029399,"data_compressed":8208,"data_compressed_allocated":442368,"data_compressed_original":884736,"omap_allocated":20462,"internal_metadata":82358290},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_sta
t":{"commit_latency_ms":1,"apply_latency_ms":1,"commit_latency_ns":1000000,"apply_latency_ns":1000000},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"2.977300"},"pg_stats":[{"pgid":"2.7","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T10:48:25.443235+0000","last_change":"2026-03-24T10:48:25.443561+0000","last_active":"2026-03-24T10:48:25.443235+0000","last_peered":"2026-03-24T10:48:25.443235+0000","last_clean":"2026-03-24T10:48:25.443235+0000","last_became_active":"2026-03-24T10:48:23.431584+0000","last_became_peered":"2026-03-24T10:48:23.431584+0000","last_unstale":"2026-03-24T10:48:25.443235+0000","last_undegraded":"2026-03-24T10:48:25.443235+0000","last_fullsized":"2026-03-24T10:48:25.443235+0000","mapping_epoch":12,"log_start":"0'0","on
disk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T18:10:48.363404+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00089274000000000005,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.6","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T10:48:25.442830+0000","last_change":"2026-03-24
T10:48:25.442965+0000","last_active":"2026-03-24T10:48:25.442830+0000","last_peered":"2026-03-24T10:48:25.442830+0000","last_clean":"2026-03-24T10:48:25.442830+0000","last_became_active":"2026-03-24T10:48:23.430333+0000","last_became_peered":"2026-03-24T10:48:23.430333+0000","last_unstale":"2026-03-24T10:48:25.442830+0000","last_undegraded":"2026-03-24T10:48:25.442830+0000","last_fullsized":"2026-03-24T10:48:25.442830+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T16:58:50.863143+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00040477699999999999,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.5","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T10:48:25.443407+0000","last_change":"2026-03-24T10:48:25.443496+0000","last_active":"2026-03-24T10:48:25.443407+0000","last_peered":"2026-03-24T10:48:25.443407+0000","last_clean":"2026-03-24T10:48:25.443407+0000","last_became_active":"2026-03-24T10:48:23.431659+0000","last_became_peered":"2026-03-24T10:48:23.431659+0000","last_unstale":"2026-03-24T10:48:25.443407+0000","last_undegraded":"2026-03-24T10:48:25.443407+0000","last_fullsized":"2026-03-24T10:48:25.443407+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_
deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T18:52:51.949355+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00077010000000000002,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.4","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T10:48:25.442793+0000","last_change":"2026-03-24T10:48:25.442963+0000","last_active":"2026-03-24T10:48:25.442793+0000","last_peered":"2026-03-24T10:48:25.442793+0000","last_clean":"2026-03-24T10:48:25.442793+0000","last_became_active":"2026-03-
24T10:48:23.430350+0000","last_became_peered":"2026-03-24T10:48:23.430350+0000","last_unstale":"2026-03-24T10:48:25.442793+0000","last_undegraded":"2026-03-24T10:48:25.442793+0000","last_fullsized":"2026-03-24T10:48:25.442793+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T20:56:41.890812+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00041283200000000002,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_locati
on_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.2","version":"15'2","reported_seq":22,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T10:48:25.448020+0000","last_change":"2026-03-24T10:48:25.448020+0000","last_active":"2026-03-24T10:48:25.448020+0000","last_peered":"2026-03-24T10:48:25.448020+0000","last_clean":"2026-03-24T10:48:25.448020+0000","last_became_active":"2026-03-24T10:48:23.429577+0000","last_became_peered":"2026-03-24T10:48:23.429577+0000","last_unstale":"2026-03-24T10:48:25.448020+0000","last_undegraded":"2026-03-24T10:48:25.448020+0000","last_fullsized":"2026-03-24T10:48:25.448020+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":2,"log_dups_size":0,"ondisk_log_size":2,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T15:28:31.147114+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00058253899999999997,"stat_sum":{"num_bytes":19,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.1","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T10:48:25.701106+0000","last_change":"2026-03-24T10:48:25.701236+0000","last_active":"2026-03-24T10:48:25.701106+0000","last_peered":"2026-03-24T10:48:25.701106+0000","last_clean":"2026-03-24T10:48:25.701106+0000","last_became_active":"2026-03-24T10:48:23.429702+0000","last_became_peered":"2026-03-24T10:48:23.429702+0000","last_unstale":"2026-03-24T10:48:25.701106+0000","last_undegraded":"2026-03-24T10:48:25.701106+0000","last_fullsized":"2026-03-24T10:48:25.701106+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last
_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T22:47:06.640036+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00024831499999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.0","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T10:48:25.701266+0000","last_change":"2026-03-24T10:48:25.701388+0000","last_active":"2026-03-24T10:48:25.701266+0000","last_peered":"2026-03-24T10:48:25.701266+0000","last_clean":"2026-03-24T10:48:25.701266+0000","last_became_active":"2026-03
-24T10:48:23.429898+0000","last_became_peered":"2026-03-24T10:48:23.429898+0000","last_unstale":"2026-03-24T10:48:25.701266+0000","last_undegraded":"2026-03-24T10:48:25.701266+0000","last_fullsized":"2026-03-24T10:48:25.701266+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T13:05:31.194855+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.000308537,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts"
:[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.3","version":"13'1","reported_seq":21,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T10:48:25.443005+0000","last_change":"2026-03-24T10:48:25.443081+0000","last_active":"2026-03-24T10:48:25.443005+0000","last_peered":"2026-03-24T10:48:25.443005+0000","last_clean":"2026-03-24T10:48:25.443005+0000","last_became_active":"2026-03-24T10:48:23.429112+0000","last_became_peered":"2026-03-24T10:48:23.429112+0000","last_unstale":"2026-03-24T10:48:25.443005+0000","last_undegraded":"2026-03-24T10:48:25.443005+0000","last_fullsized":"2026-03-24T10:48:25.443005+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":1,"log_dups_size":0,"ondisk_log_size":1,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T16:07:42.895533+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.000468486,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"1.0","version":"10'32","reported_seq":65,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-24T10:48:25.442748+0000","last_change":"2026-03-24T10:48:20.416109+0000","last_active":"2026-03-24T10:48:25.442748+0000","last_peered":"2026-03-24T10:48:25.442748+0000","last_clean":"2026-03-24T10:48:25.442748+0000","last_became_active":"2026-03-24T10:48:20.415531+0000","last_became_peered":"2026-03-24T10:48:20.415531+0000","last_unstale":"2026-03-24T10:48:25.442748+0000","last_undegraded":"2026-03-24T10:48:25.442748+0000","last_fullsized":"2026-03-24T10:48:25.442748+0000","mapping_epoch":9,"log_start":"0'0","ondisk_log_start":"0'0","created":9,"last_epoch_clean":10,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:19.407878+0000","last_deep_scrub":"0'0","last_deep_scrub_
stamp":"2026-03-24T10:48:19.407878+0000","last_clean_scrub_stamp":"2026-03-24T10:48:19.407878+0000","objects_scrubbed":0,"log_size":32,"log_dups_size":0,"ondisk_log_size":32,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T12:59:59.945556+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}],"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors
":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":38,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":3,"ondisk_log_size":3,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":483328,"data_stored":918560,"data_compressed":8208,"data_compressed_allocated":442368,"data_compressed_original":884736,"omap_allocated":0,"internal_metadata":0},"log_size":32,"ondisk_log_size
":32,"up":2,"acting":2,"num_store_stats":2}],"osd_stats":[{"osd":2,"up_from":8,"seq":34359738371,"num_pgs":3,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":26976,"kb_used_data":120,"kb_used_omap":7,"kb_used_meta":26808,"kb_avail":94344864,"statfs":{"total":96636764160,"available":96609140736,"internally_reserved":0,"allocated":122880,"data_stored":34563,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":7470,"internal_metadata":27452114},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]},{"osd":1,"up_from":8,"seq":34359738371,"num_pgs":6,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":27208,"kb_used_data":368,"kb_used_omap":8,"kb_used_meta":26807,"kb_avail":94344632,"statfs":{"total":96636764160,"available":96608903168,"internally_reserved":0,"allocated":376832,"data_stored":497418,"data_compressed":4104,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":8771,"internal_metadata":27450813},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]},{"osd":0,"up_from":8,"seq":34359738371,"num_pgs":6,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":27208,"kb_used_data":368,"kb_used_omap":4,"kb_used_meta":26811,"kb_avail":94344632,"statfs":{"total":96636764160,"available":96608903168,"internally_reserved":0,"allocated":376832,"data_stored":497418,"data_compressed":4104,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":4221,"internal_metadata":27455363},"hb_
peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":1,"apply_latency_ms":1,"commit_latency_ns":1000000,"apply_latency_ns":1000000},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":4104,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":4104,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0}]}} 2026-03-24T10:48:27.002 INFO:tasks.ceph.ceph_manager.ceph:clean! 2026-03-24T10:48:27.002 INFO:tasks.ceph:Waiting until ceph cluster ceph is healthy... 
2026-03-24T10:48:27.002 INFO:tasks.ceph.ceph_manager.ceph:wait_until_healthy
2026-03-24T10:48:27.002 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph health --format=json
2026-03-24T10:48:27.189 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T10:48:27.189 INFO:teuthology.orchestra.run.vm05.stdout:{"status":"HEALTH_OK","checks":{},"mutes":[]}
2026-03-24T10:48:27.203 INFO:tasks.ceph.ceph_manager.ceph:wait_until_healthy done
2026-03-24T10:48:27.203 INFO:teuthology.run_tasks:Running task workunit...
2026-03-24T10:48:27.207 INFO:tasks.workunit:Pulling workunits from ref 0392f78529848ec72469e8e431875cb98d3a5fb4
2026-03-24T10:48:27.208 INFO:tasks.workunit:Making a separate scratch dir for every client...
2026-03-24T10:48:27.208 DEBUG:teuthology.orchestra.run.vm05:> stat -- /home/ubuntu/cephtest/mnt.0
2026-03-24T10:48:27.211 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-24T10:48:27.211 INFO:teuthology.orchestra.run.vm05.stderr:stat: cannot statx '/home/ubuntu/cephtest/mnt.0': No such file or directory
2026-03-24T10:48:27.211 DEBUG:teuthology.orchestra.run.vm05:> mkdir -- /home/ubuntu/cephtest/mnt.0
2026-03-24T10:48:27.258 INFO:tasks.workunit:Created dir /home/ubuntu/cephtest/mnt.0
2026-03-24T10:48:27.258 DEBUG:teuthology.orchestra.run.vm05:> cd -- /home/ubuntu/cephtest/mnt.0 && mkdir -- client.0
2026-03-24T10:48:27.302 INFO:tasks.workunit:timeout=3h
2026-03-24T10:48:27.302 INFO:tasks.workunit:cleanup=True
2026-03-24T10:48:27.303 DEBUG:teuthology.orchestra.run.vm05:> rm -rf /home/ubuntu/cephtest/clone.client.0 && git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.0 && cd /home/ubuntu/cephtest/clone.client.0 && git checkout 0392f78529848ec72469e8e431875cb98d3a5fb4
2026-03-24T10:48:27.348 INFO:tasks.workunit.client.0.vm05.stderr:Cloning into '/home/ubuntu/cephtest/clone.client.0'...
2026-03-24T10:49:26.046 INFO:tasks.workunit.client.0.vm05.stderr:Note: switching to '0392f78529848ec72469e8e431875cb98d3a5fb4'.
2026-03-24T10:49:26.046 INFO:tasks.workunit.client.0.vm05.stderr:
2026-03-24T10:49:26.046 INFO:tasks.workunit.client.0.vm05.stderr:You are in 'detached HEAD' state. You can look around, make experimental
2026-03-24T10:49:26.046 INFO:tasks.workunit.client.0.vm05.stderr:changes and commit them, and you can discard any commits you make in this
2026-03-24T10:49:26.046 INFO:tasks.workunit.client.0.vm05.stderr:state without impacting any branches by switching back to a branch.
2026-03-24T10:49:26.046 INFO:tasks.workunit.client.0.vm05.stderr:
2026-03-24T10:49:26.046 INFO:tasks.workunit.client.0.vm05.stderr:If you want to create a new branch to retain commits you create, you may
2026-03-24T10:49:26.046 INFO:tasks.workunit.client.0.vm05.stderr:do so (now or later) by using -c with the switch command. Example:
2026-03-24T10:49:26.046 INFO:tasks.workunit.client.0.vm05.stderr:
2026-03-24T10:49:26.046 INFO:tasks.workunit.client.0.vm05.stderr: git switch -c
2026-03-24T10:49:26.046 INFO:tasks.workunit.client.0.vm05.stderr:
2026-03-24T10:49:26.046 INFO:tasks.workunit.client.0.vm05.stderr:Or undo this operation with:
2026-03-24T10:49:26.046 INFO:tasks.workunit.client.0.vm05.stderr:
2026-03-24T10:49:26.046 INFO:tasks.workunit.client.0.vm05.stderr: git switch -
2026-03-24T10:49:26.046 INFO:tasks.workunit.client.0.vm05.stderr:
2026-03-24T10:49:26.046 INFO:tasks.workunit.client.0.vm05.stderr:Turn off this advice by setting config variable advice.detachedHead to false
2026-03-24T10:49:26.046 INFO:tasks.workunit.client.0.vm05.stderr:
2026-03-24T10:49:26.046 INFO:tasks.workunit.client.0.vm05.stderr:HEAD is now at 0392f785298 qa/tasks/keystone: restart mariadb for rocky and alma linux too
2026-03-24T10:49:26.053 DEBUG:teuthology.orchestra.run.vm05:> cd -- /home/ubuntu/cephtest/clone.client.0/qa/workunits && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.0
2026-03-24T10:49:26.098 INFO:tasks.workunit.client.0.vm05.stdout:for d in direct_io fs ; do ( cd $d ; make all ) ; done
2026-03-24T10:49:26.099 INFO:tasks.workunit.client.0.vm05.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io'
2026-03-24T10:49:26.100 INFO:tasks.workunit.client.0.vm05.stdout:cc -Wall -Wextra -D_GNU_SOURCE direct_io_test.c -o direct_io_test
2026-03-24T10:49:26.140 INFO:tasks.workunit.client.0.vm05.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_sync_io.c -o test_sync_io
2026-03-24T10:49:26.171 INFO:tasks.workunit.client.0.vm05.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_short_dio_read.c -o test_short_dio_read
2026-03-24T10:49:26.198 INFO:tasks.workunit.client.0.vm05.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io'
2026-03-24T10:49:26.199 INFO:tasks.workunit.client.0.vm05.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs'
2026-03-24T10:49:26.199 INFO:tasks.workunit.client.0.vm05.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_o_trunc.c -o test_o_trunc
2026-03-24T10:49:26.224 INFO:tasks.workunit.client.0.vm05.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs'
2026-03-24T10:49:26.227 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-24T10:49:26.227 DEBUG:teuthology.orchestra.run.vm05:> dd if=/home/ubuntu/cephtest/workunits.list.client.0 of=/dev/stdout
2026-03-24T10:49:26.270 INFO:tasks.workunit:Running workunits matching rbd/cli_generic.sh on client.0...
2026-03-24T10:49:26.271 INFO:tasks.workunit:Running workunit rbd/cli_generic.sh...
2026-03-24T10:49:26.271 DEBUG:teuthology.orchestra.run.vm05:workunit test rbd/cli_generic.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=0392f78529848ec72469e8e431875cb98d3a5fb4 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/rbd/cli_generic.sh
2026-03-24T10:49:26.317 INFO:tasks.workunit.client.0.vm05.stderr:+ export RBD_FORCE_ALLOW_V1=1
2026-03-24T10:49:26.317 INFO:tasks.workunit.client.0.vm05.stderr:+ RBD_FORCE_ALLOW_V1=1
2026-03-24T10:49:26.317 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls
2026-03-24T10:49:26.317 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T10:49:26.317 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -v '^0$'
2026-03-24T10:49:26.340 INFO:tasks.workunit.client.0.vm05.stderr:+ IMGS='testimg1 testimg2 testimg3 testimg4 testimg5 testimg6 testimg-diff1 testimg-diff2 testimg-diff3 foo foo2 bar bar2 test1 test2 test3 test4 clone2'
2026-03-24T10:49:26.340 INFO:tasks.workunit.client.0.vm05.stderr:+ tiered=0
2026-03-24T10:49:26.340 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd dump
2026-03-24T10:49:26.340 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^pool'
2026-03-24T10:49:26.340 INFO:tasks.workunit.client.0.vm05.stderr:+ grep ''\''rbd'\'''
2026-03-24T10:49:26.340 INFO:tasks.workunit.client.0.vm05.stderr:+ grep tier
2026-03-24T10:49:26.561 INFO:tasks.workunit.client.0.vm05.stderr:+ test_pool_image_args
2026-03-24T10:49:26.561 INFO:tasks.workunit.client.0.vm05.stdout:testing pool and image args...
2026-03-24T10:49:26.561 INFO:tasks.workunit.client.0.vm05.stderr:+ echo 'testing pool and image args...'
2026-03-24T10:49:26.561 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images
2026-03-24T10:49:26.561 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:26.616 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:26.894 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:26.948 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:27.003 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:27.059 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:27.116 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:27.171 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:27.227 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:27.284 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:27.340 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:27.399 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:27.454 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:27.509 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:27.565 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:27.621 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:27.678 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:27.742 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:27.798 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd pool delete test test --yes-i-really-really-mean-it
2026-03-24T10:49:28.004 INFO:tasks.workunit.client.0.vm05.stderr:pool 'test' does not exist
2026-03-24T10:49:28.017 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd pool create test 32
2026-03-24T10:49:28.762 INFO:tasks.workunit.client.0.vm05.stderr:pool 'test' already exists
2026-03-24T10:49:28.775 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd pool init test
2026-03-24T10:49:31.727 INFO:tasks.workunit.client.0.vm05.stderr:+ truncate -s 1 /tmp/empty /tmp/empty@snap
2026-03-24T10:49:31.728 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls
2026-03-24T10:49:31.728 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T10:49:31.728 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 0
2026-03-24T10:49:31.750 INFO:tasks.workunit.client.0.vm05.stdout:0
2026-03-24T10:49:31.750 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create -s 1 test1
2026-03-24T10:49:31.770 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:49:31.763+0000 7f1d98a76200 -1 librbd: Forced V1 image creation.
2026-03-24T10:49:33.792 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls
2026-03-24T10:49:33.792 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -q test1
2026-03-24T10:49:33.814 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd import --image test2 /tmp/empty
2026-03-24T10:49:33.835 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:49:33.827+0000 7f563d26a200 -1 librbd: Forced V1 image creation.
2026-03-24T10:49:33.841 INFO:tasks.workunit.client.0.vm05.stderr: Importing image: 100% complete...done.
2026-03-24T10:49:33.842 INFO:tasks.workunit.client.0.vm05.stderr:rbd: --image is deprecated, use --dest
2026-03-24T10:49:33.845 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls
2026-03-24T10:49:33.845 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -q test2
2026-03-24T10:49:33.867 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd --dest test3 import /tmp/empty
2026-03-24T10:49:33.888 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:49:33.883+0000 7f59a8752200 -1 librbd: Forced V1 image creation.
2026-03-24T10:49:34.124 INFO:tasks.workunit.client.0.vm05.stderr: Importing image: 100% complete...done.
2026-03-24T10:49:34.128 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls
2026-03-24T10:49:34.128 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -q test3
2026-03-24T10:49:34.153 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd import /tmp/empty foo
2026-03-24T10:49:34.173 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:49:34.167+0000 7f850d15b200 -1 librbd: Forced V1 image creation.
2026-03-24T10:49:34.349 INFO:tasks.workunit.client.0.vm05.stderr: Importing image: 100% complete...done.
2026-03-24T10:49:34.353 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls
2026-03-24T10:49:34.353 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -q foo
2026-03-24T10:49:34.379 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd import --dest test/empty@snap /tmp/empty
2026-03-24T10:49:34.394 INFO:tasks.workunit.client.0.vm05.stderr:rbd: destination snapshot name specified for a command that doesn't use it
2026-03-24T10:49:34.396 INFO:tasks.workunit.client.0.vm05.stderr:+ true
2026-03-24T10:49:34.396 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd import /tmp/empty test/empty@snap
2026-03-24T10:49:34.411 INFO:tasks.workunit.client.0.vm05.stderr:rbd: destination snapshot name specified for a command that doesn't use it
2026-03-24T10:49:34.413 INFO:tasks.workunit.client.0.vm05.stderr:+ true
2026-03-24T10:49:34.413 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd import --image test/empty@snap /tmp/empty
2026-03-24T10:49:34.428 INFO:tasks.workunit.client.0.vm05.stderr:rbd: destination snapshot name specified for a command that doesn't use it
2026-03-24T10:49:34.428 INFO:tasks.workunit.client.0.vm05.stderr:rbd: --image is deprecated, use --dest
2026-03-24T10:49:34.430 INFO:tasks.workunit.client.0.vm05.stderr:+ true
2026-03-24T10:49:34.430 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd import /tmp/empty@snap
2026-03-24T10:49:34.445 INFO:tasks.workunit.client.0.vm05.stderr:rbd: destination snapshot name specified for a command that doesn't use it
2026-03-24T10:49:34.447 INFO:tasks.workunit.client.0.vm05.stderr:+ true
2026-03-24T10:49:34.447 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls test
2026-03-24T10:49:34.447 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T10:49:34.447 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 0
2026-03-24T10:49:34.472 INFO:tasks.workunit.client.0.vm05.stdout:0
2026-03-24T10:49:34.472 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd import /tmp/empty test/test1
2026-03-24T10:49:34.495 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:49:34.487+0000 7f8d2422e200 -1 librbd: Forced V1 image creation.
2026-03-24T10:49:35.858 INFO:tasks.workunit.client.0.vm05.stderr: Importing image: 100% complete...done.
2026-03-24T10:49:35.862 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls test
2026-03-24T10:49:35.862 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -q test1
2026-03-24T10:49:35.886 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd -p test import /tmp/empty test2
2026-03-24T10:49:35.910 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:49:35.903+0000 7f263a2a0200 -1 librbd: Forced V1 image creation.
2026-03-24T10:49:35.959 INFO:tasks.workunit.client.0.vm05.stderr: Importing image: 100% complete...done.
2026-03-24T10:49:35.960 INFO:tasks.workunit.client.0.vm05.stderr:rbd: -p [ --pool ] is deprecated, use --dest-pool
2026-03-24T10:49:35.963 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls test
2026-03-24T10:49:35.963 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -q test2
2026-03-24T10:49:35.986 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd --image test3 -p test import /tmp/empty
2026-03-24T10:49:36.009 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:49:36.003+0000 7efe6e9e5200 -1 librbd: Forced V1 image creation.
2026-03-24T10:49:36.198 INFO:tasks.workunit.client.0.vm05.stderr: Importing image: 100% complete...done.
2026-03-24T10:49:36.199 INFO:tasks.workunit.client.0.vm05.stderr:rbd: --image is deprecated, use --dest
2026-03-24T10:49:36.199 INFO:tasks.workunit.client.0.vm05.stderr:rbd: -p [ --pool ] is deprecated, use --dest-pool
2026-03-24T10:49:36.202 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls test
2026-03-24T10:49:36.202 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -q test3
2026-03-24T10:49:36.226 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd --image test4 -p test import /tmp/empty
2026-03-24T10:49:36.247 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:49:36.239+0000 7f8576017200 -1 librbd: Forced V1 image creation.
2026-03-24T10:49:36.258 INFO:tasks.workunit.client.0.vm05.stderr: Importing image: 100% complete...done.
2026-03-24T10:49:36.258 INFO:tasks.workunit.client.0.vm05.stderr:rbd: --image is deprecated, use --dest
2026-03-24T10:49:36.259 INFO:tasks.workunit.client.0.vm05.stderr:rbd: -p [ --pool ] is deprecated, use --dest-pool
2026-03-24T10:49:36.262 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls test
2026-03-24T10:49:36.262 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -q test4
2026-03-24T10:49:36.288 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd --dest test5 -p test import /tmp/empty
2026-03-24T10:49:36.309 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:49:36.303+0000 7f7bc4dda200 -1 librbd: Forced V1 image creation.
2026-03-24T10:49:36.376 INFO:tasks.workunit.client.0.vm05.stderr: Importing image: 100% complete...done.
2026-03-24T10:49:36.376 INFO:tasks.workunit.client.0.vm05.stderr:rbd: -p [ --pool ] is deprecated, use --dest-pool
2026-03-24T10:49:36.380 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls test
2026-03-24T10:49:36.380 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -q test5
2026-03-24T10:49:36.407 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd --dest test6 --dest-pool test import /tmp/empty
2026-03-24T10:49:36.432 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:49:36.427+0000 7fed59e2b200 -1 librbd: Forced V1 image creation.
2026-03-24T10:49:36.440 INFO:tasks.workunit.client.0.vm05.stderr: Importing image: 100% complete...done.
2026-03-24T10:49:36.443 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls test
2026-03-24T10:49:36.444 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -q test6
2026-03-24T10:49:36.469 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd --image test7 --dest-pool test import /tmp/empty
2026-03-24T10:49:36.492 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:49:36.487+0000 7fd9e403a200 -1 librbd: Forced V1 image creation.
2026-03-24T10:49:36.499 INFO:tasks.workunit.client.0.vm05.stderr: Importing image: 100% complete...done.
2026-03-24T10:49:36.500 INFO:tasks.workunit.client.0.vm05.stderr:rbd: --image is deprecated, use --dest
2026-03-24T10:49:36.503 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls test
2026-03-24T10:49:36.503 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -q test7
2026-03-24T10:49:36.527 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd --image test/test8 import /tmp/empty
2026-03-24T10:49:36.548 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:49:36.543+0000 7f8468da7200 -1 librbd: Forced V1 image creation.
2026-03-24T10:49:36.555 INFO:tasks.workunit.client.0.vm05.stderr: Importing image: 100% complete...done.
2026-03-24T10:49:36.556 INFO:tasks.workunit.client.0.vm05.stderr:rbd: --image is deprecated, use --dest
2026-03-24T10:49:36.559 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls test
2026-03-24T10:49:36.559 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -q test8
2026-03-24T10:49:36.585 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd --dest test/test9 import /tmp/empty
2026-03-24T10:49:36.607 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:49:36.599+0000 7fcebe921200 -1 librbd: Forced V1 image creation.
2026-03-24T10:49:36.615 INFO:tasks.workunit.client.0.vm05.stderr: Importing image: 100% complete...done.
2026-03-24T10:49:36.618 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls test
2026-03-24T10:49:36.618 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -q test9
2026-03-24T10:49:36.642 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd import --pool test /tmp/empty
2026-03-24T10:49:36.665 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:49:36.659+0000 7f7a23e91200 -1 librbd: Forced V1 image creation.
2026-03-24T10:49:36.672 INFO:tasks.workunit.client.0.vm05.stderr: Importing image: 100% complete...done.
2026-03-24T10:49:36.673 INFO:tasks.workunit.client.0.vm05.stderr:rbd: -p [ --pool ] is deprecated, use --dest-pool
2026-03-24T10:49:36.676 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls test
2026-03-24T10:49:36.676 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -q empty
2026-03-24T10:49:36.701 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd copy test/test9 test10
2026-03-24T10:49:36.734 INFO:tasks.workunit.client.0.vm05.stderr: Image copy: 100% complete... Image copy: 100% complete...done.
2026-03-24T10:49:36.738 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls test
2026-03-24T10:49:36.738 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -qv test10
2026-03-24T10:49:36.761 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls
2026-03-24T10:49:36.761 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -q test10
2026-03-24T10:49:36.784 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd copy test/test9 test/test10
2026-03-24T10:49:36.817 INFO:tasks.workunit.client.0.vm05.stderr: Image copy: 100% complete... Image copy: 100% complete...done.
2026-03-24T10:49:36.821 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls test
2026-03-24T10:49:36.821 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -q test10
2026-03-24T10:49:36.845 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd copy --pool test test10 --dest-pool test test11
2026-03-24T10:49:36.880 INFO:tasks.workunit.client.0.vm05.stderr: Image copy: 100% complete... Image copy: 100% complete...done.
2026-03-24T10:49:36.884 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls test
2026-03-24T10:49:36.884 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -q test11
2026-03-24T10:49:36.907 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd copy --dest-pool rbd --pool test test11 test12
2026-03-24T10:49:36.942 INFO:tasks.workunit.client.0.vm05.stderr: Image copy: 100% complete... Image copy: 100% complete...done.
2026-03-24T10:49:36.946 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls
2026-03-24T10:49:36.946 INFO:tasks.workunit.client.0.vm05.stderr:+ grep test12
2026-03-24T10:49:36.968 INFO:tasks.workunit.client.0.vm05.stdout:test12
2026-03-24T10:49:36.968 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls test
2026-03-24T10:49:36.968 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -qv test12
2026-03-24T10:49:36.991 INFO:tasks.workunit.client.0.vm05.stderr:+ rm -f /tmp/empty /tmp/empty@snap
2026-03-24T10:49:36.991 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd pool delete test test --yes-i-really-really-mean-it
2026-03-24T10:49:37.912 INFO:tasks.workunit.client.0.vm05.stderr:pool 'test' does not exist
2026-03-24T10:49:37.924 INFO:tasks.workunit.client.0.vm05.stderr:+ for f in foo test1 test10 test12 test2 test3
2026-03-24T10:49:37.924 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm foo
2026-03-24T10:49:37.965 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:49:37.969 INFO:tasks.workunit.client.0.vm05.stderr:+ for f in foo test1 test10 test12 test2 test3
2026-03-24T10:49:37.969 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test1
2026-03-24T10:49:37.997 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:49:38.001 INFO:tasks.workunit.client.0.vm05.stderr:+ for f in foo test1 test10 test12 test2 test3
2026-03-24T10:49:38.001 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test10
2026-03-24T10:49:38.047 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:49:38.050 INFO:tasks.workunit.client.0.vm05.stderr:+ for f in foo test1 test10 test12 test2 test3
2026-03-24T10:49:38.050 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test12
2026-03-24T10:49:38.097 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:49:38.101 INFO:tasks.workunit.client.0.vm05.stderr:+ for f in foo test1 test10 test12 test2 test3
2026-03-24T10:49:38.101 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test2
2026-03-24T10:49:38.128 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:49:38.132 INFO:tasks.workunit.client.0.vm05.stderr:+ for f in foo test1 test10 test12 test2 test3
2026-03-24T10:49:38.132 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test3
2026-03-24T10:49:38.160 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:49:38.164 INFO:tasks.workunit.client.0.vm05.stdout:testing rename...
2026-03-24T10:49:38.164 INFO:tasks.workunit.client.0.vm05.stderr:+ test_rename
2026-03-24T10:49:38.164 INFO:tasks.workunit.client.0.vm05.stderr:+ echo 'testing rename...'
2026-03-24T10:49:38.164 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images
2026-03-24T10:49:38.164 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:38.219 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:38.479 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:38.536 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:38.591 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:38.648 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:38.704 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:38.759 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:38.816 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:38.874 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:38.928 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:38.985 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:39.040 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:39.095 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:39.152 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:39.206 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:39.260 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:39.315 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:39.369 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 1 -s 1 foo
2026-03-24T10:49:39.384 INFO:tasks.workunit.client.0.vm05.stderr:rbd: image format 1 is deprecated
2026-03-24T10:49:39.391 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:49:39.383+0000 7f7fdbc53200 -1 librbd: Forced V1 image creation.
2026-03-24T10:49:39.397 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 -s 1 bar
2026-03-24T10:49:39.426 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rename foo foo2
2026-03-24T10:49:39.460 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rename foo2 bar
2026-03-24T10:49:39.460 INFO:tasks.workunit.client.0.vm05.stderr:+ grep exists
2026-03-24T10:49:39.486 INFO:tasks.workunit.client.0.vm05.stdout:2026-03-24T10:49:39.475+0000 7fd727bb2200 -1 librbd::Operations: rbd image bar already exists
2026-03-24T10:49:39.486 INFO:tasks.workunit.client.0.vm05.stdout:rbd: rename error: (17) File exists
2026-03-24T10:49:39.486 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rename bar bar2
2026-03-24T10:49:39.524 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rename bar2 foo2
2026-03-24T10:49:39.524 INFO:tasks.workunit.client.0.vm05.stderr:+ grep exists
2026-03-24T10:49:39.555 INFO:tasks.workunit.client.0.vm05.stdout:2026-03-24T10:49:39.543+0000 7f1a87f8e200 -1 librbd::Operations: rbd image foo2 already exists
2026-03-24T10:49:39.555 INFO:tasks.workunit.client.0.vm05.stdout:rbd: rename error: (17) File exists
2026-03-24T10:49:39.556 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd pool create rbd2 8
2026-03-24T10:49:39.918 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd2' already exists
2026-03-24T10:49:39.930 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd pool init rbd2
2026-03-24T10:49:42.901 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create -p rbd2 -s 1 foo
2026-03-24T10:49:42.922 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:49:42.915+0000 7fc47dcf0200 -1 librbd: Forced V1 image creation.
2026-03-24T10:49:44.049 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rename rbd2/foo rbd2/bar
2026-03-24T10:49:44.080 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd -p rbd2 ls
2026-03-24T10:49:44.080 INFO:tasks.workunit.client.0.vm05.stderr:+ grep bar
2026-03-24T10:49:44.104 INFO:tasks.workunit.client.0.vm05.stdout:bar
2026-03-24T10:49:44.105 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rename rbd2/bar foo
2026-03-24T10:49:44.138 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rename --pool rbd2 foo bar
2026-03-24T10:49:44.174 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rename rbd2/bar --dest-pool rbd foo
2026-03-24T10:49:44.190 INFO:tasks.workunit.client.0.vm05.stderr:rbd: mv/rename across pools not supported
2026-03-24T10:49:44.191 INFO:tasks.workunit.client.0.vm05.stderr:source pool: rbd2 dest pool: rbd
2026-03-24T10:49:44.192 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rename --pool rbd2 bar --dest-pool rbd2 foo
2026-03-24T10:49:44.228 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd -p rbd2 ls
2026-03-24T10:49:44.228 INFO:tasks.workunit.client.0.vm05.stderr:+ grep foo
2026-03-24T10:49:44.250 INFO:tasks.workunit.client.0.vm05.stdout:foo
2026-03-24T10:49:44.251 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd pool rm rbd2 rbd2 --yes-i-really-really-mean-it
2026-03-24T10:49:45.092 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd2' does not exist
2026-03-24T10:49:45.104 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images
2026-03-24T10:49:45.104 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:45.165 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:45.218 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:45.277 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:45.337 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:45.394 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:45.651 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:45.910 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:45.967 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:46.025 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:46.085 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:46.146 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:46.202 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:46.493 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:46.548 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:46.604 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:46.663 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:46.724 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:46.985 INFO:tasks.workunit.client.0.vm05.stdout:testing ls...
2026-03-24T10:49:46.985 INFO:tasks.workunit.client.0.vm05.stderr:+ test_ls
2026-03-24T10:49:46.985 INFO:tasks.workunit.client.0.vm05.stderr:+ echo 'testing ls...'
2026-03-24T10:49:46.985 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images
2026-03-24T10:49:46.985 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:47.043 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:47.103 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:47.163 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:47.223 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:47.282 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:47.339 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:47.402 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:47.472 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:47.747 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:47.807 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:47.864 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:47.926 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:47.984 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:48.042 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:48.153 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:48.210 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:48.265 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:49:48.320 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 1 -s 1 test1
2026-03-24T10:49:48.334 INFO:tasks.workunit.client.0.vm05.stderr:rbd: image format 1 is deprecated
2026-03-24T10:49:48.340 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:49:48.335+0000 7f6cb58b6200 -1 librbd: Forced V1 image creation.
2026-03-24T10:49:48.344 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 1 -s 1 test2
2026-03-24T10:49:48.358 INFO:tasks.workunit.client.0.vm05.stderr:rbd: image format 1 is deprecated
2026-03-24T10:49:48.363 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:49:48.359+0000 7ffbf7e8e200 -1 librbd: Forced V1 image creation.
2026-03-24T10:49:48.368 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls
2026-03-24T10:49:48.368 INFO:tasks.workunit.client.0.vm05.stderr:+ grep test1
2026-03-24T10:49:48.388 INFO:tasks.workunit.client.0.vm05.stdout:test1
2026-03-24T10:49:48.388 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls
2026-03-24T10:49:48.388 INFO:tasks.workunit.client.0.vm05.stderr:+ grep test2
2026-03-24T10:49:48.411 INFO:tasks.workunit.client.0.vm05.stdout:test2
2026-03-24T10:49:48.411 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls
2026-03-24T10:49:48.411 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T10:49:48.411 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 2
2026-03-24T10:49:48.433 INFO:tasks.workunit.client.0.vm05.stdout:2
2026-03-24T10:49:48.433 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls -l
2026-03-24T10:49:48.433 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'test1.*1 MiB.*1'
2026-03-24T10:49:48.457 INFO:tasks.workunit.client.0.vm05.stdout:test1 1 MiB 1
2026-03-24T10:49:48.457 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls -l
2026-03-24T10:49:48.457 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'test2.*1 MiB.*1'
2026-03-24T10:49:48.484 INFO:tasks.workunit.client.0.vm05.stdout:test2 1 MiB 1
2026-03-24T10:49:48.484 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test1
2026-03-24T10:49:48.512 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:49:48.516 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test2
2026-03-24T10:49:48.543 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:49:48.546 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 -s 1 test1
2026-03-24T10:49:48.577 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 -s 1 test2
2026-03-24T10:49:48.609 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls
2026-03-24T10:49:48.609 INFO:tasks.workunit.client.0.vm05.stderr:+ grep test1
2026-03-24T10:49:48.632 INFO:tasks.workunit.client.0.vm05.stdout:test1
2026-03-24T10:49:48.632 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls
2026-03-24T10:49:48.632 INFO:tasks.workunit.client.0.vm05.stderr:+ grep test2
2026-03-24T10:49:48.654 INFO:tasks.workunit.client.0.vm05.stdout:test2
2026-03-24T10:49:48.655 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls
2026-03-24T10:49:48.655 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T10:49:48.655 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 2
2026-03-24T10:49:48.678 INFO:tasks.workunit.client.0.vm05.stdout:2
2026-03-24T10:49:48.678 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls -l
2026-03-24T10:49:48.678 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'test1.*1 MiB.*2'
2026-03-24T10:49:48.708 INFO:tasks.workunit.client.0.vm05.stdout:test1 1 MiB 2
2026-03-24T10:49:48.708 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls -l
2026-03-24T10:49:48.708 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'test2.*1 MiB.*2'
2026-03-24T10:49:48.735 INFO:tasks.workunit.client.0.vm05.stdout:test2 1 MiB 2
2026-03-24T10:49:48.736 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test1
2026-03-24T10:49:48.789 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:49:48.783+0000 7f930ec9d640 0 -- 192.168.123.105:0/3257789362 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7f92f005bd40 msgr2=0x7f92f007c120 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T10:51:33.756 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:33.751+0000 7f930ec9d640 0 -- 192.168.123.105:0/3257789362 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7f92e8008d30 msgr2=0x7f92f007c770 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T10:51:33.821 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:33.825 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test2
2026-03-24T10:51:33.881 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:33.875+0000 7f5b1aa15640 0 -- 192.168.123.105:0/3534313546 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x558a911f3320 msgr2=0x558a911d1790 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T10:51:33.883 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:33.887 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 -s 1 test1
2026-03-24T10:51:33.917 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 1 -s 1 test2
2026-03-24T10:51:33.931 INFO:tasks.workunit.client.0.vm05.stderr:rbd: image format 1 is deprecated
2026-03-24T10:51:33.938 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:33.931+0000 7ff57aa85200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:33.944 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls
2026-03-24T10:51:33.944 INFO:tasks.workunit.client.0.vm05.stderr:+ grep test1
2026-03-24T10:51:33.964 INFO:tasks.workunit.client.0.vm05.stdout:test1
2026-03-24T10:51:33.965 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls
2026-03-24T10:51:33.965 INFO:tasks.workunit.client.0.vm05.stderr:+ grep test2
2026-03-24T10:51:33.985 INFO:tasks.workunit.client.0.vm05.stdout:test2
2026-03-24T10:51:33.986 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls
2026-03-24T10:51:33.986 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T10:51:33.986 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 2
2026-03-24T10:51:34.007 INFO:tasks.workunit.client.0.vm05.stdout:2
2026-03-24T10:51:34.007 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls -l
2026-03-24T10:51:34.008 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'test1.*1 MiB.*2'
2026-03-24T10:51:34.036 INFO:tasks.workunit.client.0.vm05.stdout:test1 1 MiB 2
2026-03-24T10:51:34.036 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls -l
2026-03-24T10:51:34.036 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'test2.*1 MiB.*1'
2026-03-24T10:51:34.063 INFO:tasks.workunit.client.0.vm05.stdout:test2 1 MiB 1
2026-03-24T10:51:34.063 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images
2026-03-24T10:51:34.063 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:51:34.116 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:51:34.171 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:51:34.223 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:51:34.276 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:51:34.327 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:51:34.379 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:51:34.431 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:51:34.484 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:51:34.536 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:51:34.588 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:51:34.647 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:51:34.701 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:51:34.753 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:51:34.840 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:51:34.897 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:51:34.949 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:51:35.001 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T10:51:35.053 INFO:tasks.workunit.client.0.vm05.stderr:++ seq -w 00 99
2026-03-24T10:51:35.053 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.053 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.00 -s 1
2026-03-24T10:51:35.073 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.067+0000 7f4eb7fef200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.078 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.078 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.01 -s 1
2026-03-24T10:51:35.097 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.091+0000 7fb8fc961200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.102 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.102 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.02 -s 1
2026-03-24T10:51:35.121 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.115+0000 7f1144986200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.126 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.126 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.03 -s 1
2026-03-24T10:51:35.146 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.139+0000 7f1fc34e2200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.151 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.151 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.04 -s 1
2026-03-24T10:51:35.171 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.163+0000 7f1e08457200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.176 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.176 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.05 -s 1
2026-03-24T10:51:35.195 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.187+0000 7f2744031200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.200 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.200 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.06 -s 1
2026-03-24T10:51:35.219 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.211+0000 7f1f4cf5e200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.224 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.224 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.07 -s 1
2026-03-24T10:51:35.245 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.239+0000 7fa59e963200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.250 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.250 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.08 -s 1
2026-03-24T10:51:35.270 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.263+0000 7fd3ada9d200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.275 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.275 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.09 -s 1
2026-03-24T10:51:35.295 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.287+0000 7f50220b8200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.300 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.300 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.10 -s 1
2026-03-24T10:51:35.319 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.311+0000 7fadfc936200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.324 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.324 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.11 -s 1
2026-03-24T10:51:35.344 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.339+0000 7fe6170c1200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.349 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.349 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.12 -s 1
2026-03-24T10:51:35.369 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.363+0000 7f62a9bce200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.374 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.374 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.13 -s 1
2026-03-24T10:51:35.395 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.387+0000 7f52dfd94200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.400 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.400 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.14 -s 1
2026-03-24T10:51:35.419 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.415+0000 7f54f4d8d200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.425 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.425 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.15 -s 1
2026-03-24T10:51:35.445 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.439+0000 7f4e277e0200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.450 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.450 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.16 -s 1
2026-03-24T10:51:35.469 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.463+0000 7f04d18cb200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.474 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.474 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.17 -s 1
2026-03-24T10:51:35.493 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.487+0000 7ff4438ce200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.498 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.498 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.18 -s 1
2026-03-24T10:51:35.518 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.511+0000 7f063c3ea200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.523 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.523 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.19 -s 1
2026-03-24T10:51:35.542 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.535+0000 7f1d74c15200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.547 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.547 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.20 -s 1
2026-03-24T10:51:35.568 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.563+0000 7fcf9de3d200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.573 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.573 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.21 -s 1
2026-03-24T10:51:35.593 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.587+0000 7f320e6dc200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.599 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.599 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.22 -s 1
2026-03-24T10:51:35.619 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.611+0000 7fd849c7d200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.624 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.624 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.23 -s 1
2026-03-24T10:51:35.644 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.639+0000 7f7e583b6200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.649 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.649 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.24 -s 1
2026-03-24T10:51:35.670 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.663+0000 7fce142de200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.675 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.675 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.25 -s 1
2026-03-24T10:51:35.694 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.687+0000 7fb70d788200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.700 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.700 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.26 -s 1
2026-03-24T10:51:35.719 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.711+0000 7f404e80c200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.724 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.724 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.27 -s 1
2026-03-24T10:51:35.744 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.739+0000 7f0303198200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.750 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.750 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.28 -s 1
2026-03-24T10:51:35.769 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.763+0000 7f7d15bee200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.774 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.774 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.29 -s 1
2026-03-24T10:51:35.793 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.787+0000 7ff42f705200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.798 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.798 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.30 -s 1
2026-03-24T10:51:35.818 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.811+0000 7fe9953f3200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.823 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.823 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.31 -s 1
2026-03-24T10:51:35.843 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.835+0000 7fa49dea9200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.848 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.848 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.32 -s 1
2026-03-24T10:51:35.868 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.863+0000 7f06678c2200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.874 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.874 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.33 -s 1
2026-03-24T10:51:35.894 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.887+0000 7efc8aa43200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.898 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.898 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.34 -s 1
2026-03-24T10:51:35.917 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.911+0000 7fe34f255200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.922 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.922 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.35 -s 1
2026-03-24T10:51:35.941 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.935+0000 7f9a4a23e200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.947 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.947 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.36 -s 1
2026-03-24T10:51:35.967 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.959+0000 7fd435d8c200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.972 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.972 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.37 -s 1
2026-03-24T10:51:35.991 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:35.983+0000 7f96f17e9200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:35.997 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:35.997 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.38 -s 1
2026-03-24T10:51:36.016 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.011+0000 7fe161113200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.021 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.021 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.39 -s 1
2026-03-24T10:51:36.040 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.035+0000 7f9458b17200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.045 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.045 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.40 -s 1
2026-03-24T10:51:36.064 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.059+0000 7f893bd52200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.069 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.069 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.41 -s 1
2026-03-24T10:51:36.088 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.083+0000 7f731d040200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.094 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.094 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.42 -s 1
2026-03-24T10:51:36.114 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.107+0000 7feafbff7200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.119 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.119 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.43 -s 1
2026-03-24T10:51:36.139 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.131+0000 7faf05533200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.144 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.144 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.44 -s 1
2026-03-24T10:51:36.163 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.155+0000 7f0d2c761200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.167 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.167 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.45 -s 1
2026-03-24T10:51:36.186 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.179+0000 7f1291cdf200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.192 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.192 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.46 -s 1
2026-03-24T10:51:36.211 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.203+0000 7f081ff75200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.215 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.215 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.47 -s 1
2026-03-24T10:51:36.235 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.231+0000 7f4a6fb23200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.241 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.241 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.48 -s 1
2026-03-24T10:51:36.260 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.255+0000 7fc75d61c200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.265 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.265 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.49 -s 1
2026-03-24T10:51:36.284 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.279+0000 7f752acab200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.289 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.289 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.50 -s 1
2026-03-24T10:51:36.308 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.303+0000 7f50351fe200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.314 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.314 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.51 -s 1
2026-03-24T10:51:36.333 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.327+0000 7fe72251e200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.338 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.338 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.52 -s 1
2026-03-24T10:51:36.357 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.351+0000 7f7085bd1200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.362 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.362 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.53 -s 1
2026-03-24T10:51:36.381 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.375+0000 7f2594000200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.386 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.386 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.54 -s 1
2026-03-24T10:51:36.407 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.399+0000 7fb2ec393200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.411 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.411 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.55 -s 1
2026-03-24T10:51:36.431 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.427+0000 7f16a1687200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.436 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.436 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.56 -s 1
2026-03-24T10:51:36.456 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.451+0000 7f16c663b200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.462 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.462 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.57 -s 1
2026-03-24T10:51:36.482 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.475+0000 7f14488c0200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.487 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.487 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.58 -s 1
2026-03-24T10:51:36.507 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.499+0000 7f2570351200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.512 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.512 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.59 -s 1
2026-03-24T10:51:36.532 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.527+0000 7fb13a50d200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.536 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.536 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.60 -s 1
2026-03-24T10:51:36.556 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.551+0000 7fc8e57e2200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.561 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.561 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.61 -s 1
2026-03-24T10:51:36.582 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.575+0000 7f05b5102200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.587 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.587 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.62 -s 1
2026-03-24T10:51:36.607 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.603+0000 7f793ffed200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.612 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.612 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.63 -s 1
2026-03-24T10:51:36.631 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.623+0000 7f9602372200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.637 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.637 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.64 -s 1
2026-03-24T10:51:36.657 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.651+0000 7fb750158200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.662 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.662 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.65 -s 1
2026-03-24T10:51:36.681 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.675+0000 7f6cc36d8200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.686 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.686 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.66 -s 1
2026-03-24T10:51:36.705 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.699+0000 7f753779e200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.710 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.710 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.67 -s 1
2026-03-24T10:51:36.730 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.723+0000 7f04fa779200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.735 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.735 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.68 -s 1
2026-03-24T10:51:36.754 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.747+0000 7fc5ce171200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.758 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.758 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.69 -s 1
2026-03-24T10:51:36.777 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.771+0000 7f5bbe26d200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.782 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.782 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.70 -s 1
2026-03-24T10:51:36.802 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.795+0000 7fb5f33e9200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.807 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.807 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.71 -s 1
2026-03-24T10:51:36.828 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.823+0000 7f77a3649200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.833 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.833 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.72 -s 1
2026-03-24T10:51:36.853 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.847+0000 7fbb2125a200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.859 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.859 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.73 -s 1
2026-03-24T10:51:36.879 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.871+0000 7f99b0da3200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.884 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.884 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.74 -s 1
2026-03-24T10:51:36.904 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.899+0000 7fed96f56200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.908 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.908 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.75 -s 1
2026-03-24T10:51:36.928 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.923+0000 7f8d66768200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.933 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.933 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.76 -s 1
2026-03-24T10:51:36.954 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.947+0000 7f81af3f4200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.959 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.959 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.77 -s 1
2026-03-24T10:51:36.979 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.971+0000 7faf1623c200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:36.985 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:36.985 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.78 -s 1
2026-03-24T10:51:37.006 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:36.999+0000 7f86ead84200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:37.012 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:37.012 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.79 -s 1
2026-03-24T10:51:37.032 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:37.027+0000 7f8a2bfd9200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:37.038 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:37.038 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.80 -s 1
2026-03-24T10:51:37.060 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:37.055+0000 7f7e958ff200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:37.066 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:37.066 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.81 -s 1
2026-03-24T10:51:37.088 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:37.083+0000 7f4c34759200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:37.094 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:37.094 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.82 -s 1
2026-03-24T10:51:37.115 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:37.111+0000 7fe88f68e200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:37.121 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:37.121 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.83 -s 1
2026-03-24T10:51:37.140 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:37.135+0000 7faee7f8f200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:37.145 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:37.145 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.84 -s 1
2026-03-24T10:51:37.165 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:37.159+0000 7fd805355200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:37.171 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:37.171 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.85 -s 1
2026-03-24T10:51:37.191 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:37.183+0000 7f84dfbc8200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:37.196 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:37.196 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.86 -s 1
2026-03-24T10:51:37.215 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:37.207+0000 7faae721d200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:37.220 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:37.221 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.87 -s 1
2026-03-24T10:51:37.240 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:37.235+0000 7faa581d5200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:37.245 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:37.245 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.88 -s 1
2026-03-24T10:51:37.265 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:37.259+0000 7f92a357e200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:37.270 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:37.270 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.89 -s 1
2026-03-24T10:51:37.290 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:37.283+0000 7f4a1df82200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:37.295 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:37.295 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.90 -s 1
2026-03-24T10:51:37.316 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:37.311+0000 7f21f147f200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:37.321 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:37.322 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.91 -s 1
2026-03-24T10:51:37.342 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:37.335+0000 7f39ab6ff200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:37.348 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:37.348 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.92 -s 1
2026-03-24T10:51:37.368 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:37.363+0000 7f108e211200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:37.374 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:37.375 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.93 -s 1
2026-03-24T10:51:37.397 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:37.391+0000 7fb7f025d200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:37.402 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:37.402 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.94 -s 1
2026-03-24T10:51:37.424 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:37.419+0000 7f1523e75200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:37.430 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:37.430 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.95 -s 1
2026-03-24T10:51:37.452 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:37.447+0000 7f15bea42200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:37.459 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:37.459 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.96 -s 1
2026-03-24T10:51:37.478 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:37.471+0000 7fd3d739e200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:37.485 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:37.485 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.97 -s 1
2026-03-24T10:51:37.507 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:37.499+0000 7f424b2ae200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:37.513 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:37.513 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.98 -s 1
2026-03-24T10:51:37.534 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:37.527+0000 7f31304f1200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:37.540 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:37.540 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.99 -s 1
2026-03-24T10:51:37.561 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:37.555+0000 7faadbe37200 -1 librbd: Forced V1 image creation.
2026-03-24T10:51:37.567 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls
2026-03-24T10:51:37.567 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T10:51:37.567 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 100
2026-03-24T10:51:37.588 INFO:tasks.workunit.client.0.vm05.stdout:100
2026-03-24T10:51:37.588 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls -l
2026-03-24T10:51:37.588 INFO:tasks.workunit.client.0.vm05.stderr:+ grep image
2026-03-24T10:51:37.588 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T10:51:37.588 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 100
2026-03-24T10:51:37.628 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:37.623+0000 7f82bcb4e640 0 -- 192.168.123.105:0/2339147336 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x5560fe807aa0 msgr2=0x5560fe8485b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T10:51:37.632 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:37.627+0000 7f82bcb4e640 0 -- 192.168.123.105:0/2339147336 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7f829c05bcf0 msgr2=0x7f829c07c0d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T10:51:37.724 INFO:tasks.workunit.client.0.vm05.stdout:100
2026-03-24T10:51:37.724 INFO:tasks.workunit.client.0.vm05.stderr:++ seq -w 00 99
2026-03-24T10:51:37.725 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:37.725 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.00
2026-03-24T10:51:37.752 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:37.756 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:37.756 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.01
2026-03-24T10:51:37.784 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:37.787 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:37.787 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.02
2026-03-24T10:51:37.813 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:37.816 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:37.816 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.03
2026-03-24T10:51:37.843 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:37.846 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:37.846 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.04
2026-03-24T10:51:37.872 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:37.875 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:37.875 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.05
2026-03-24T10:51:37.901 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:37.904 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:37.904 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.06
2026-03-24T10:51:37.931 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:37.934 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:37.934 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.07
2026-03-24T10:51:37.959 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:37.962 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:37.962 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.08
2026-03-24T10:51:38.178 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:38.171+0000 7fb7828fc640 0 --2- 192.168.123.105:0/3026772990 >> v2:192.168.123.105:3300/0 conn(0x558d30c76320 0x558d30c766f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
2026-03-24T10:51:38.191 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:38.195 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:38.195 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.09
2026-03-24T10:51:38.220 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:38.223 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:38.223 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.10
2026-03-24T10:51:38.250 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:38.252 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:38.252 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.11
2026-03-24T10:51:38.278 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:38.281 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:38.281 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.12
2026-03-24T10:51:38.308 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:38.311 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:38.311 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.13
2026-03-24T10:51:38.339 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:38.342 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:38.342 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.14
2026-03-24T10:51:38.370 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:38.373 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:38.373 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.15
2026-03-24T10:51:38.399 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:38.401 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:38.401 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.16
2026-03-24T10:51:38.427 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:38.431 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:38.431 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.17
2026-03-24T10:51:38.458 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:38.461 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:38.461 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.18
2026-03-24T10:51:38.486 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:38.489 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:38.489 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.19
2026-03-24T10:51:38.515 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:38.517 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:38.517 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.20
2026-03-24T10:51:38.543 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:38.546 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:38.546 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.21
2026-03-24T10:51:38.571 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:38.574 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:38.574 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.22
2026-03-24T10:51:38.598 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:38.601 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:38.601 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.23
2026-03-24T10:51:38.627 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:38.630 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:38.630 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.24
2026-03-24T10:51:38.656 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:38.658 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:38.658 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.25
2026-03-24T10:51:38.685 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:38.687 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:38.688 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.26
2026-03-24T10:51:38.713 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:38.716 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:38.716 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.27
2026-03-24T10:51:38.742 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:38.745 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:38.745 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.28
2026-03-24T10:51:38.772 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:38.775 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:38.775 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.29
2026-03-24T10:51:38.801 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:38.804 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:38.804 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.30
2026-03-24T10:51:38.832 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:38.834 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:38.834 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.31
2026-03-24T10:51:38.861 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:38.864 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:38.864 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.32
2026-03-24T10:51:38.890 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:38.893 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:38.893 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.33
2026-03-24T10:51:38.920 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:38.923 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:38.923 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.34
2026-03-24T10:51:38.949 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:38.952 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:38.952 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.35
2026-03-24T10:51:38.980 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:38.983 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:38.983 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.36
2026-03-24T10:51:39.010 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:39.013 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:39.013 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.37
2026-03-24T10:51:39.041 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:39.045 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:39.045 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.38
2026-03-24T10:51:39.072 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:39.075 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:39.075 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.39
2026-03-24T10:51:39.105 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:39.109 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:39.109 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.40
2026-03-24T10:51:39.138 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:39.141 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:39.141 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.41
2026-03-24T10:51:39.172 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:39.176 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:39.176 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.42
2026-03-24T10:51:39.204 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:39.207 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:39.207 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.43
2026-03-24T10:51:39.235 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:39.239 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:39.239 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.44
2026-03-24T10:51:39.268 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:39.272 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:39.272 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.45
2026-03-24T10:51:39.538 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:39.542 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:39.542 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.46
2026-03-24T10:51:39.569 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:39.572 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:39.572 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.47
2026-03-24T10:51:39.652 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:39.655 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:39.655 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.48
2026-03-24T10:51:39.810 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:39.814 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:39.814 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.49
2026-03-24T10:51:39.841 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:39.845 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:39.845 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.50
2026-03-24T10:51:39.874 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:39.878 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:39.878 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.51
2026-03-24T10:51:39.912 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:39.916 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:39.916 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.52
2026-03-24T10:51:39.948 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:39.952 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:39.952 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.53
2026-03-24T10:51:39.980 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:39.983 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:39.983 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.54
2026-03-24T10:51:40.010 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:40.013 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:40.013 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.55
2026-03-24T10:51:40.039 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:40.042 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:40.042 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.56
2026-03-24T10:51:40.070 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:40.073 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:40.073 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.57
2026-03-24T10:51:40.101 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:40.104 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:40.104 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.58
2026-03-24T10:51:40.133 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:40.136 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:40.136 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.59
2026-03-24T10:51:40.165 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:40.168 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:40.168 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.60
2026-03-24T10:51:40.198 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:40.202 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:40.202 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.61
2026-03-24T10:51:40.232 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:40.236 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:40.236 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.62
2026-03-24T10:51:40.263 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:40.266 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:40.266 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.63
2026-03-24T10:51:40.293 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:40.295 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:40.295 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.64
2026-03-24T10:51:40.321 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:40.324 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:40.324 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.65
2026-03-24T10:51:40.349 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:40.352 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:40.352 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.66
2026-03-24T10:51:40.379 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:40.382 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:40.382 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.67
2026-03-24T10:51:40.408 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:40.410 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:40.410 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.68
2026-03-24T10:51:40.436 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:40.439 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:40.439 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.69
2026-03-24T10:51:40.466 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:40.469 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:40.469 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.70
2026-03-24T10:51:40.495 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:40.498 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:40.498 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.71
2026-03-24T10:51:40.524 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:40.527 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:40.527 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.72
2026-03-24T10:51:40.554 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:40.557 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:40.557 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.73
2026-03-24T10:51:40.598 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:40.600 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:40.600 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.74
2026-03-24T10:51:40.627 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:40.630 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:40.630 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.75
2026-03-24T10:51:40.655 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:40.658 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:40.658 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.76
2026-03-24T10:51:40.683 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:40.686 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:40.686 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.77
2026-03-24T10:51:40.711 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:40.714 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:40.714 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.78
2026-03-24T10:51:40.740 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:40.743 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:40.743 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.79
2026-03-24T10:51:40.769 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:40.774 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:40.774 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.80
2026-03-24T10:51:40.801 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:40.804 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:40.804 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.81 2026-03-24T10:51:41.022 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:41.015+0000 7f9195f98640 0 --2- 192.168.123.105:0/4157647330 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x555759e82c50 0x555759e726b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 2026-03-24T10:51:41.032 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T10:51:41.035 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:41.035 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.82 2026-03-24T10:51:41.061 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T10:51:41.064 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:41.064 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.83 2026-03-24T10:51:41.089 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T10:51:41.091 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:41.091 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.84 2026-03-24T10:51:41.117 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T10:51:41.119 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:41.119 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.85 2026-03-24T10:51:41.145 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 
2026-03-24T10:51:41.148 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:41.148 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.86 2026-03-24T10:51:41.174 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T10:51:41.176 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:41.176 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.87 2026-03-24T10:51:41.203 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T10:51:41.205 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:41.206 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.88 2026-03-24T10:51:41.231 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T10:51:41.233 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:41.233 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.89 2026-03-24T10:51:41.259 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T10:51:41.261 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:41.261 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.90 2026-03-24T10:51:41.291 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T10:51:41.295 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:41.295 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.91 2026-03-24T10:51:41.324 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T10:51:41.327 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:41.327 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.92 2026-03-24T10:51:41.354 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 
2026-03-24T10:51:41.357 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:41.357 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.93 2026-03-24T10:51:41.385 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T10:51:41.388 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:41.388 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.94 2026-03-24T10:51:41.415 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T10:51:41.419 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:41.419 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.95 2026-03-24T10:51:41.448 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T10:51:41.451 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:41.451 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.96 2026-03-24T10:51:41.478 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T10:51:41.481 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:41.481 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.97 2026-03-24T10:51:41.509 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T10:51:41.512 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:41.512 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.98 2026-03-24T10:51:41.538 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T10:51:41.541 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:41.541 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.99 2026-03-24T10:51:41.568 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 
2026-03-24T10:51:41.571 INFO:tasks.workunit.client.0.vm05.stderr:++ seq -w 00 99 2026-03-24T10:51:41.571 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:41.572 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.00 --image-format 2 -s 1 2026-03-24T10:51:41.601 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:41.601 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.01 --image-format 2 -s 1 2026-03-24T10:51:41.629 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:41.629 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.02 --image-format 2 -s 1 2026-03-24T10:51:41.659 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:41.659 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.03 --image-format 2 -s 1 2026-03-24T10:51:41.689 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:41.689 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.04 --image-format 2 -s 1 2026-03-24T10:51:41.719 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:41.719 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.05 --image-format 2 -s 1 2026-03-24T10:51:41.748 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:41.748 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.06 --image-format 2 -s 1 2026-03-24T10:51:41.777 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:41.777 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.07 --image-format 2 -s 1 2026-03-24T10:51:41.806 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:41.807 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.08 --image-format 2 -s 1 2026-03-24T10:51:41.835 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 
2026-03-24T10:51:41.835 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.09 --image-format 2 -s 1 2026-03-24T10:51:41.865 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:41.865 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.10 --image-format 2 -s 1 2026-03-24T10:51:41.894 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:41.894 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.11 --image-format 2 -s 1 2026-03-24T10:51:41.924 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:41.924 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.12 --image-format 2 -s 1 2026-03-24T10:51:41.953 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:41.953 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.13 --image-format 2 -s 1 2026-03-24T10:51:41.982 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:41.982 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.14 --image-format 2 -s 1 2026-03-24T10:51:42.011 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.011 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.15 --image-format 2 -s 1 2026-03-24T10:51:42.040 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.040 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.16 --image-format 2 -s 1 2026-03-24T10:51:42.070 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.070 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.17 --image-format 2 -s 1 2026-03-24T10:51:42.099 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.099 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.18 --image-format 2 -s 1
2026-03-24T10:51:42.128 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.129 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.19 --image-format 2 -s 1 2026-03-24T10:51:42.159 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.159 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.20 --image-format 2 -s 1 2026-03-24T10:51:42.188 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.188 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.21 --image-format 2 -s 1 2026-03-24T10:51:42.218 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.218 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.22 --image-format 2 -s 1 2026-03-24T10:51:42.247 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.247 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.23 --image-format 2 -s 1 2026-03-24T10:51:42.291 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.291 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.24 --image-format 2 -s 1 2026-03-24T10:51:42.320 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.320 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.25 --image-format 2 -s 1 2026-03-24T10:51:42.350 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.350 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.26 --image-format 2 -s 1 2026-03-24T10:51:42.379 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.379 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.27 --image-format 2 -s 1 2026-03-24T10:51:42.409 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.409 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.28 --image-format 2 -s 1
2026-03-24T10:51:42.437 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.437 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.29 --image-format 2 -s 1 2026-03-24T10:51:42.466 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.466 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.30 --image-format 2 -s 1 2026-03-24T10:51:42.496 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.496 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.31 --image-format 2 -s 1 2026-03-24T10:51:42.527 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.527 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.32 --image-format 2 -s 1 2026-03-24T10:51:42.557 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.557 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.33 --image-format 2 -s 1 2026-03-24T10:51:42.588 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.589 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.34 --image-format 2 -s 1 2026-03-24T10:51:42.621 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.621 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.35 --image-format 2 -s 1 2026-03-24T10:51:42.653 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.653 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.36 --image-format 2 -s 1 2026-03-24T10:51:42.685 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.685 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.37 --image-format 2 -s 1 2026-03-24T10:51:42.716 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.716 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.38 --image-format 2 -s 1
2026-03-24T10:51:42.748 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.748 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.39 --image-format 2 -s 1 2026-03-24T10:51:42.779 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.779 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.40 --image-format 2 -s 1 2026-03-24T10:51:42.809 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.809 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.41 --image-format 2 -s 1 2026-03-24T10:51:42.839 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.839 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.42 --image-format 2 -s 1 2026-03-24T10:51:42.869 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.869 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.43 --image-format 2 -s 1 2026-03-24T10:51:42.899 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.899 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.44 --image-format 2 -s 1 2026-03-24T10:51:42.929 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.929 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.45 --image-format 2 -s 1 2026-03-24T10:51:42.957 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.957 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.46 --image-format 2 -s 1 2026-03-24T10:51:42.987 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:42.987 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.47 --image-format 2 -s 1 2026-03-24T10:51:43.017 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:43.017 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.48 --image-format 2 -s 1
2026-03-24T10:51:43.048 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:43.048 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.49 --image-format 2 -s 1 2026-03-24T10:51:43.078 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:43.078 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.50 --image-format 2 -s 1 2026-03-24T10:51:43.108 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:43.108 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.51 --image-format 2 -s 1 2026-03-24T10:51:43.138 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:43.138 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.52 --image-format 2 -s 1 2026-03-24T10:51:43.168 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:43.168 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.53 --image-format 2 -s 1 2026-03-24T10:51:43.199 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:43.199 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.54 --image-format 2 -s 1 2026-03-24T10:51:43.227 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:43.227 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.55 --image-format 2 -s 1 2026-03-24T10:51:43.257 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:43.257 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.56 --image-format 2 -s 1 2026-03-24T10:51:43.286 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:43.286 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.57 --image-format 2 -s 1 2026-03-24T10:51:43.316 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:43.316 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.58 --image-format 2 -s 1
2026-03-24T10:51:43.345 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:43.345 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.59 --image-format 2 -s 1 2026-03-24T10:51:43.374 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:43.374 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.60 --image-format 2 -s 1 2026-03-24T10:51:43.405 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:43.405 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.61 --image-format 2 -s 1 2026-03-24T10:51:43.435 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:43.435 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.62 --image-format 2 -s 1 2026-03-24T10:51:43.465 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:43.465 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.63 --image-format 2 -s 1 2026-03-24T10:51:43.494 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:43.494 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.64 --image-format 2 -s 1 2026-03-24T10:51:43.524 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:43.524 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.65 --image-format 2 -s 1 2026-03-24T10:51:43.554 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:43.554 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.66 --image-format 2 -s 1 2026-03-24T10:51:43.583 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:43.583 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.67 --image-format 2 -s 1 2026-03-24T10:51:43.613 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:43.613 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.68 --image-format 2 -s 1
2026-03-24T10:51:43.642 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:43.642 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.69 --image-format 2 -s 1 2026-03-24T10:51:43.672 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:43.672 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.70 --image-format 2 -s 1 2026-03-24T10:51:43.701 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:43.701 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.71 --image-format 2 -s 1 2026-03-24T10:51:43.732 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:43.732 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.72 --image-format 2 -s 1 2026-03-24T10:51:43.762 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:43.762 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.73 --image-format 2 -s 1 2026-03-24T10:51:43.795 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:43.795 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.74 --image-format 2 -s 1 2026-03-24T10:51:43.825 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:43.825 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.75 --image-format 2 -s 1 2026-03-24T10:51:43.856 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:43.856 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.76 --image-format 2 -s 1 2026-03-24T10:51:43.886 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:43.886 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.77 --image-format 2 -s 1 2026-03-24T10:51:44.103 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:44.095+0000 7fb29e47e640 0 --2- 192.168.123.105:0/4233732972 >> v2:192.168.123.105:3300/0 conn(0x563ba34be520 0x563ba34be8f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
2026-03-24T10:51:44.119 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:44.119 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.78 --image-format 2 -s 1 2026-03-24T10:51:44.149 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:44.149 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.79 --image-format 2 -s 1 2026-03-24T10:51:44.180 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:44.180 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.80 --image-format 2 -s 1 2026-03-24T10:51:44.210 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:44.210 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.81 --image-format 2 -s 1 2026-03-24T10:51:44.240 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:44.240 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.82 --image-format 2 -s 1 2026-03-24T10:51:44.271 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:44.271 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.83 --image-format 2 -s 1 2026-03-24T10:51:44.336 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:44.336 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.84 --image-format 2 -s 1 2026-03-24T10:51:44.366 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:44.366 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.85 --image-format 2 -s 1 2026-03-24T10:51:44.396 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:44.397 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.86 --image-format 2 -s 1
2026-03-24T10:51:44.426 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:44.426 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.87 --image-format 2 -s 1 2026-03-24T10:51:44.456 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:44.456 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.88 --image-format 2 -s 1 2026-03-24T10:51:44.486 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:44.486 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.89 --image-format 2 -s 1 2026-03-24T10:51:44.515 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:44.515 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.90 --image-format 2 -s 1 2026-03-24T10:51:44.545 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:44.545 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.91 --image-format 2 -s 1 2026-03-24T10:51:44.574 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:44.574 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.92 --image-format 2 -s 1 2026-03-24T10:51:44.605 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:44.605 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.93 --image-format 2 -s 1 2026-03-24T10:51:44.635 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:44.635 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.94 --image-format 2 -s 1 2026-03-24T10:51:44.663 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:44.663 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.95 --image-format 2 -s 1 2026-03-24T10:51:44.693 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:44.693 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.96 --image-format 2 -s 1
2026-03-24T10:51:44.724 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:44.724 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.97 --image-format 2 -s 1 2026-03-24T10:51:44.753 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:44.753 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.98 --image-format 2 -s 1 2026-03-24T10:51:44.785 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:44.785 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create image.99 --image-format 2 -s 1 2026-03-24T10:51:44.821 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls 2026-03-24T10:51:44.821 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T10:51:44.821 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 100 2026-03-24T10:51:44.847 INFO:tasks.workunit.client.0.vm05.stdout:100 2026-03-24T10:51:44.847 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls -l 2026-03-24T10:51:44.847 INFO:tasks.workunit.client.0.vm05.stderr:+ grep image 2026-03-24T10:51:44.847 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T10:51:44.847 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 100 2026-03-24T10:51:44.883 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:44.875+0000 7f694df67640 0 -- 192.168.123.105:0/4192263334 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7f692c05bd40 msgr2=0x7f692c07c120 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T10:51:44.887 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:44.879+0000 7f694df67640 0 -- 192.168.123.105:0/4192263334 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55a0f5185560 msgr2=0x7f692c09ce50 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T10:51:45.063 INFO:tasks.workunit.client.0.vm05.stdout:100
2026-03-24T10:51:45.063 INFO:tasks.workunit.client.0.vm05.stderr:++ seq -w 00 99 2026-03-24T10:51:45.064 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:45.064 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.00 2026-03-24T10:51:45.119 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:45.111+0000 7fe8a1f08640 0 -- 192.168.123.105:0/1310202968 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x5578d30e3320 msgr2=0x5578d3118390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T10:51:45.119 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T10:51:45.122 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:45.122 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.01 2026-03-24T10:51:45.175 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:45.167+0000 7fd8dc39f640 0 -- 192.168.123.105:0/1591760376 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55b2c25b8820 msgr2=0x55b2c25ed510 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T10:51:45.177 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T10:51:45.181 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:45.181 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.02 2026-03-24T10:51:45.238 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:45.231+0000 7fda5d65a640 0 -- 192.168.123.105:0/694014135 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x558d3c7a1dc0 msgr2=0x558d3c8859d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T10:51:45.238 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 
2026-03-24T10:51:45.242 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:45.242 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.03 2026-03-24T10:51:45.297 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:45.291+0000 7f73084f5640 0 -- 192.168.123.105:0/1382056684 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x565126882150 msgr2=0x56512683bda0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T10:51:45.297 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T10:51:45.301 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:45.301 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.04 2026-03-24T10:51:45.355 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:45.347+0000 7ffa87116640 0 -- 192.168.123.105:0/851986376 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7ffa6805bd40 msgr2=0x7ffa6807c120 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T10:51:45.359 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T10:51:45.363 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:45.363 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.05 2026-03-24T10:51:45.418 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:45.411+0000 7fdc0880c640 0 -- 192.168.123.105:0/4183750659 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x5654f87f0320 msgr2=0x5654f8825390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T10:51:45.421 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 
2026-03-24T10:51:45.424 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:45.425 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.06 2026-03-24T10:51:45.483 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:45.475+0000 7f1374aca640 0 -- 192.168.123.105:0/3799161041 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x56215904c720 msgr2=0x56215903c540 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T10:51:45.483 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T10:51:45.486 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:45.486 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.07 2026-03-24T10:51:45.544 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:45.539+0000 7f08a2f5c640 0 -- 192.168.123.105:0/2339035657 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x565049fc3320 msgr2=0x565049ff8390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T10:51:45.544 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T10:51:45.547 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99) 2026-03-24T10:51:45.547 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.08 2026-03-24T10:51:45.603 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:45.595+0000 7fca5cb80640 0 -- 192.168.123.105:0/1197283208 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7fca34008d30 msgr2=0x7fca340291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T10:51:45.606 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 
2026-03-24T10:51:45.609 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:45.609 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.09
2026-03-24T10:51:45.664 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:45.659+0000 7fe12737d640 0 -- 192.168.123.105:0/298272144 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55dae6b32320 msgr2=0x55dae6b10790 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T10:51:45.664 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:45.667 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:45.667 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.10
2026-03-24T10:51:45.719 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:45.711+0000 7f6defcf2640 0 -- 192.168.123.105:0/2798707855 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55f7a5687320 msgr2=0x55f7a5665800 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T10:51:45.719 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:45.722 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:45.722 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.11
2026-03-24T10:51:45.778 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:45.771+0000 7f1e2f45d640 0 -- 192.168.123.105:0/1953513699 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55c2a00f3150 msgr2=0x55c2a00acda0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T10:51:45.778 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:45.781 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:45.781 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.12
2026-03-24T10:51:45.837 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:45.831+0000 7fc946fe8640 0 -- 192.168.123.105:0/4207578573 >> [v2:192.168.123.105:6808/3270659984,v1:192.168.123.105:6809/3270659984] conn(0x56479a6a1b20 msgr2=0x56479a6c1fa0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T10:51:45.839 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:45.842 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:45.842 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.13
2026-03-24T10:51:45.905 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:45.909 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:45.909 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.14
2026-03-24T10:51:45.967 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:45.959+0000 7ff518347640 0 -- 192.168.123.105:0/15098856 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55a4c43e8150 msgr2=0x55a4c43a1ca0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T10:51:46.171 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:46.175 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:46.175 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.15
2026-03-24T10:51:46.230 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:46.223+0000 7f3ef8d32640 0 -- 192.168.123.105:0/266911535 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x558842bf7320 msgr2=0x558842c2c390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T10:51:46.233 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:46.237 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:46.237 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.16
2026-03-24T10:51:46.289 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:46.283+0000 7fcd3d8a3640 0 -- 192.168.123.105:0/2408161793 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7fcd20012e70 msgr2=0x7fcd200132e0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T10:51:46.292 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:46.295 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:46.295 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.17
2026-03-24T10:51:46.348 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:46.343+0000 7f9983287640 0 -- 192.168.123.105:0/4100937265 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7f996005be10 msgr2=0x7f996007c1f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T10:51:46.353 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:46.356 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:46.356 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.18
2026-03-24T10:51:46.409 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:46.403+0000 7fdc903b9640 0 -- 192.168.123.105:0/1016726231 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x564fea80a360 msgr2=0x564fea83be70 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T10:51:46.412 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:46.415 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:46.415 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.19
2026-03-24T10:51:46.474 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:46.467+0000 7fc947fff640 0 -- 192.168.123.105:0/2462485701 >> [v2:192.168.123.105:6808/3270659984,v1:192.168.123.105:6809/3270659984] conn(0x555c08cfdfe0 msgr2=0x555c08d1e460 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T10:51:46.477 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:46.481 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:46.481 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.20
2026-03-24T10:51:46.535 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:46.531+0000 7fb61de05640 0 -- 192.168.123.105:0/3936805086 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x563e539c4150 msgr2=0x563e5397dd50 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T10:51:46.536 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:46.539 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:46.539 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.21
2026-03-24T10:51:46.599 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:46.591+0000 7f6d641a2640 0 -- 192.168.123.105:0/1487423442 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55b9a8036320 msgr2=0x55b9a8014800 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T10:51:46.601 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:46.604 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:46.604 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.22
2026-03-24T10:51:46.657 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:46.651+0000 7f2a46b8c640 0 -- 192.168.123.105:0/3074435799 >> [v2:192.168.123.105:6808/3270659984,v1:192.168.123.105:6809/3270659984] conn(0x7f2a2805bd40 msgr2=0x7f2a2807c120 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T10:51:46.660 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:46.664 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:46.664 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.23
2026-03-24T10:51:46.718 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:46.721 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:46.721 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.24
2026-03-24T10:51:46.772 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:46.767+0000 7f6ff84ba640 0 -- 192.168.123.105:0/2263579446 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7f6fd805bd40 msgr2=0x7f6fd807c120 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T10:51:46.777 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:46.780 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:46.780 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.25
2026-03-24T10:51:46.830 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:46.823+0000 7fe30ce33640 0 -- 192.168.123.105:0/1158256262 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x557bddf70150 msgr2=0x557bddf61fa0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T10:51:46.835 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:46.838 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:46.838 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.26
2026-03-24T10:51:47.054 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:47.047+0000 7fc62cf35640 0 -- 192.168.123.105:0/881116251 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55d4742d3320 msgr2=0x55d4742b1800 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T10:51:47.056 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:47.060 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:47.060 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.27
2026-03-24T10:51:47.117 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:47.120 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:47.120 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.28
2026-03-24T10:51:47.175 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:47.167+0000 7fe2b085f640 0 -- 192.168.123.105:0/577402028 >> [v2:192.168.123.105:6808/3270659984,v1:192.168.123.105:6809/3270659984] conn(0x7fe29005bcf0 msgr2=0x7fe29007c0d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T10:51:47.178 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:47.181 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:47.181 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.29
2026-03-24T10:51:47.239 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:47.231+0000 7efddc10b640 0 -- 192.168.123.105:0/2365271419 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x556b4069c150 msgr2=0x556b4067f3c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T10:51:47.239 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:47.243 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:47.243 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.30
2026-03-24T10:51:47.298 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:47.291+0000 7f777b1e6640 0 -- 192.168.123.105:0/2018369429 >> [v2:192.168.123.105:6808/3270659984,v1:192.168.123.105:6809/3270659984] conn(0x563d2d6ef0a0 msgr2=0x563d2d70f520 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T10:51:47.300 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T10:51:47.304 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T10:51:47.304 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.31
2026-03-24T10:51:47.360 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:51:47.355+0000 7f7f81a2e640 0 -- 192.168.123.105:0/3192967245 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7f7f6005be10 msgr2=0x7f7f6007c1f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T10:54:47.334 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T10:54:47.328+0000 7f7f8222f640 0 -- 192.168.123.105:0/3192967245 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x5560081d9c50 msgr2=0x5560081c9800 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:06:47.362 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:06:47.366 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:06:47.366 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.32
2026-03-24T11:06:47.430 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:06:47.427+0000 7fb25f73e640 0 -- 192.168.123.105:0/3093835610 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x560c2b3cfdc0 msgr2=0x560c2b4b39d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:06:47.430 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:06:47.434 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:06:47.434 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.33
2026-03-24T11:06:47.509 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:06:47.507+0000 7fe6aa4d7640 0 -- 192.168.123.105:0/39835940 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7fe68805be10 msgr2=0x7fe68807c1f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:06:47.515 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:06:47.519 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:06:47.519 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.34
2026-03-24T11:06:47.588 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:06:47.592 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:06:47.592 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.35
2026-03-24T11:06:47.682 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:06:47.679+0000 7fa2837fe640 0 -- 192.168.123.105:0/2408305678 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7fa26405bd40 msgr2=0x7fa26407c120 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:09:47.638 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:09:47.640+0000 7fa283fff640 0 -- 192.168.123.105:0/2408305678 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x556f84f2cc50 msgr2=0x556f84f21740 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:47.686 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:47.689 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:47.689 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.36
2026-03-24T11:21:47.745 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:47.749+0000 7fdbea298640 0 -- 192.168.123.105:0/2913123332 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x560f5c19c320 msgr2=0x560f5c17a800 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:47.745 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:47.748 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:47.748 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.37
2026-03-24T11:21:47.801 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:47.805+0000 7fb39cb59640 0 -- 192.168.123.105:0/1288413483 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x5564737e9150 msgr2=0x5564737d8f80 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:47.803 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:47.806 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:47.806 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.38
2026-03-24T11:21:47.864 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:47.867 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:47.867 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.39
2026-03-24T11:21:47.925 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:47.929+0000 7fe66cb42640 0 -- 192.168.123.105:0/1714684083 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x562bdbf4a320 msgr2=0x562bdbf28800 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:47.926 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:47.930 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:47.930 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.40
2026-03-24T11:21:47.988 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:47.989+0000 7f40e7425640 0 -- 192.168.123.105:0/816969283 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55a4a97bc320 msgr2=0x55a4a979a800 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:47.990 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:47.993 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:47.993 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.41
2026-03-24T11:21:48.050 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:48.053+0000 7f94755aa640 0 -- 192.168.123.105:0/1270804108 >> [v2:192.168.123.105:6808/3270659984,v1:192.168.123.105:6809/3270659984] conn(0x7f945405be10 msgr2=0x7f945407c1f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:48.053 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:48.056 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:48.056 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.42
2026-03-24T11:21:48.113 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:48.117+0000 7f5077716640 0 -- 192.168.123.105:0/2816176038 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55a32e75c320 msgr2=0x55a32e73a800 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:48.116 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:48.119 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:48.119 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.43
2026-03-24T11:21:48.177 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:48.181+0000 7f67fa261640 0 -- 192.168.123.105:0/3199685081 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55f8e54c7320 msgr2=0x55f8e54fc390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:48.180 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:48.183 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:48.183 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.44
2026-03-24T11:21:48.239 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:48.241+0000 7f14f0676640 0 -- 192.168.123.105:0/1071132646 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55ad3b96e320 msgr2=0x55ad3b94c800 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:48.241 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:48.244 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:48.244 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.45
2026-03-24T11:21:48.300 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:48.301+0000 7f65bc87a640 0 -- 192.168.123.105:0/1019150030 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x56164ca14320 msgr2=0x56164c9f2800 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:48.302 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:48.305 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:48.305 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.46
2026-03-24T11:21:48.359 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:48.361+0000 7f7762b14640 0 -- 192.168.123.105:0/3892256327 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7f774005be10 msgr2=0x7f774007c1f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:48.365 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:48.368 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:48.368 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.47
2026-03-24T11:21:48.422 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:48.425+0000 7f42f48d6640 0 -- 192.168.123.105:0/388226368 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x56143e3365b0 msgr2=0x56143e3264b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:48.424 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:48.427 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:48.427 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.48
2026-03-24T11:21:48.481 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:48.485+0000 7f6db37fe640 0 -- 192.168.123.105:0/4160085140 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7f6d9403ff10 msgr2=0x7f6d940419c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:48.484 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:48.487 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:48.487 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.49
2026-03-24T11:21:48.542 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:48.545+0000 7f6e3673b640 0 -- 192.168.123.105:0/2232396675 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7f6e10012e40 msgr2=0x7f6e100132b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:48.544 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:48.548 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:48.548 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.50
2026-03-24T11:21:48.603 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:48.605+0000 7f72bd8ff640 0 -- 192.168.123.105:0/1406402980 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7f729c05bd40 msgr2=0x7f729c07c120 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:48.605 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:48.609 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:48.609 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.51
2026-03-24T11:21:48.662 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:48.665+0000 7f836bfff640 0 -- 192.168.123.105:0/95525985 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7f8348008d30 msgr2=0x7f83480291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:48.664 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:48.667 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:48.667 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.52
2026-03-24T11:21:48.725 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:48.729+0000 7fe8b8916640 0 -- 192.168.123.105:0/151265637 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x5625b70dd320 msgr2=0x5625b7112390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:48.729 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:48.732 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:48.732 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.53
2026-03-24T11:21:48.791 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:48.793+0000 7f01fe5e3640 0 -- 192.168.123.105:0/3530081519 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55c5d8133320 msgr2=0x55c5d8111800 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:48.792 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:48.796 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:48.796 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.54
2026-03-24T11:21:48.853 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:48.857+0000 7f15de8f0640 0 -- 192.168.123.105:0/3298602173 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55b416249320 msgr2=0x55b416227800 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:48.853 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:48.856 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:48.856 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.55
2026-03-24T11:21:48.911 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:48.913+0000 7f5424a17640 0 -- 192.168.123.105:0/3848716152 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7f53fc012e20 msgr2=0x7f53fc013290 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:48.913 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:48.916 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:48.916 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.56
2026-03-24T11:21:48.972 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:48.973+0000 7f14777fe640 0 -- 192.168.123.105:0/2708114073 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7f145c008d30 msgr2=0x7f145c0291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:48.973 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:48.977 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:48.977 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.57
2026-03-24T11:21:49.031 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:49.033+0000 7fd748fbc640 0 -- 192.168.123.105:0/4114295528 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55f2b617e150 msgr2=0x55f2b616df80 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:49.035 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:49.038 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:49.038 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.58
2026-03-24T11:21:49.093 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:49.097+0000 7fb09a294640 0 -- 192.168.123.105:0/1417012468 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x564558147320 msgr2=0x564558125800 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:49.095 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:49.098 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:49.098 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.59
2026-03-24T11:21:49.358 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:49.361+0000 7efe0ce50640 0 -- 192.168.123.105:0/2644948857 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7efde800a430 msgr2=0x7efde804d9a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:49.565 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:49.569 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:49.569 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.60
2026-03-24T11:21:49.625 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:49.629+0000 7fb8b0095640 0 -- 192.168.123.105:0/1316184502 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x556b93009360 msgr2=0x556b9303ad60 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:49.627 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:49.630 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:49.630 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.61
2026-03-24T11:21:49.682 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:49.685+0000 7f584a1fe640 0 -- 192.168.123.105:0/273648770 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x561f9eabc320 msgr2=0x561f9eaf1390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T11:21:49.684 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:49.688 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:49.688 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.62
2026-03-24T11:21:49.744 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:49.745+0000 7f7107242640 0 -- 192.168.123.105:0/1719921905 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x557b37813320 msgr2=0x557b37848390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:49.744 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:49.747 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:49.747 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.63
2026-03-24T11:21:49.802 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:49.805 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:49.805 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.64
2026-03-24T11:21:49.860 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:49.861+0000 7fda91cf3640 0 -- 192.168.123.105:0/3934345131 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x555bde441150 msgr2=0x555bde401590 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:49.863 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:49.866 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:49.866 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.65
2026-03-24T11:21:49.921 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:49.925 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:49.925 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.66
2026-03-24T11:21:49.978 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:49.981+0000 7fe31520d640 0 -- 192.168.123.105:0/966333560 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x563bbac4e320 msgr2=0x563bbac83390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:49.981 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:49.984 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:49.984 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.67
2026-03-24T11:21:50.037 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:50.041+0000 7f178afdd640 0 -- 192.168.123.105:0/311190980 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55c57f7a5320 msgr2=0x55c57f7da390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:50.040 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:50.043 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:50.043 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.68
2026-03-24T11:21:50.100 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:50.105+0000 7f3506deb640 0 -- 192.168.123.105:0/2176883094 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x5653fd2cd320 msgr2=0x5653fd302390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:50.100 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:50.104 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:50.104 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.69
2026-03-24T11:21:50.160 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:50.161+0000 7f6bab7da640 0 -- 192.168.123.105:0/2985207919 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55ac4c923360 msgr2=0x55ac4c956fc0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:50.160 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:50.163 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:50.163 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.70
2026-03-24T11:21:50.236 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:50.237+0000 7f2ead524640 0 -- 192.168.123.105:0/744967750 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55753d646320 msgr2=0x55753d67b390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T11:21:50.239 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:50.243 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:50.243 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.71
2026-03-24T11:21:50.299 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:50.301+0000 7f41b5f49640 0 -- 192.168.123.105:0/155257104 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7f419405bd40 msgr2=0x7f419407c120 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:50.304 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:50.308 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:50.308 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.72
2026-03-24T11:21:50.362 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:50.365+0000 7faf7ad3f640 0 -- 192.168.123.105:0/3410735191 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7faf54006ed0 msgr2=0x7faf5400a970 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:50.365 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:50.368 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:50.368 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.73
2026-03-24T11:21:50.423 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:50.425+0000 7f4e6081e640 0 -- 192.168.123.105:0/2869901169 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55ceb9a25320 msgr2=0x55ceb9a5a390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:50.426 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:50.429 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:50.430 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.74
2026-03-24T11:21:50.482 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:50.485+0000 7f663f315640 0 -- 192.168.123.105:0/3102338745 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7f662005bd40 msgr2=0x7f662007c120 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:50.487 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:50.490 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:50.490 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.75
2026-03-24T11:21:50.543 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:50.545+0000 7fd8eebdc640 0 -- 192.168.123.105:0/932335 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7fd8c8046350 msgr2=0x7fd8c80495c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:50.546 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:50.549 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:50.549 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.76
2026-03-24T11:21:50.602 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:50.605+0000 7fa51a478640 0 -- 192.168.123.105:0/641275370 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7fa4f8003690 msgr2=0x7fa4f8042fe0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:50.604 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:50.608 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:50.608 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.77
2026-03-24T11:21:50.665 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:50.669+0000 7fe6b84c2640 0 -- 192.168.123.105:0/3164311216 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x5569d2e365b0 msgr2=0x5569d2e264b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:50.666 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:50.669 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:50.669 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.78
2026-03-24T11:21:50.725 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:50.729+0000 7fc88d2e9640 0 -- 192.168.123.105:0/2845122113 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7fc870012f60 msgr2=0x7fc8700133d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:21:50.727 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:50.730 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:50.730 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.79
2026-03-24T11:21:50.786 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:21:50.790 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:21:50.790 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.80
2026-03-24T11:21:50.843 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:21:50.845+0000 7f2ca5f19640 0 -- 192.168.123.105:0/4236076273 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7f2c88008d30 msgr2=0x7f2c880291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:24:50.824 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:24:50.825+0000 7f2ca671a640 0 -- 192.168.123.105:0/4236076273 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x556a448f1c50 msgr2=0x556a448e6600 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:36:50.850 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:36:50.853 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:36:50.853 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.81
2026-03-24T11:36:50.910 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:36:50.913+0000 7f89cdfd1640 0 -- 192.168.123.105:0/3143624474 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x559447cee150 msgr2=0x559447cddf80 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:36:50.911 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:36:50.915 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:36:50.915 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.82
2026-03-24T11:36:50.969 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:36:50.969+0000 7f537ccce640 0 -- 192.168.123.105:0/1235381179 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7f535c05bd40 msgr2=0x7f535c07c120 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:36:50.973 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:36:50.976 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:36:50.976 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.83
2026-03-24T11:36:51.031 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:36:51.033+0000 7feac1114640 0 -- 192.168.123.105:0/433861478 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7fea98006cb0 msgr2=0x7fea98027130 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T11:36:51.034 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:36:51.037 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:36:51.037 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.84
2026-03-24T11:36:51.096 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:36:51.099 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:36:51.099 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.85
2026-03-24T11:36:51.153 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:36:51.157+0000 7f8de7fff640 0 -- 192.168.123.105:0/1258874027 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7f8dc4008d30 msgr2=0x7f8dc40291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:36:51.156 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:36:51.159 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:36:51.159 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.86
2026-03-24T11:36:51.216 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:36:51.217+0000 7fe5bf28d640 0 -- 192.168.123.105:0/3412858625 >> [v2:192.168.123.105:6808/3270659984,v1:192.168.123.105:6809/3270659984] conn(0x7fe5ac0023a0 msgr2=0x55e293b09da0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:36:51.423 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:36:51.426 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:36:51.427 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.87
2026-03-24T11:36:51.488 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:36:51.489+0000 7f8203fff640 0 -- 192.168.123.105:0/2608776636 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7f81e0008d30 msgr2=0x7f81e00291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:36:51.492 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:36:51.495 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:36:51.496 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.88
2026-03-24T11:36:51.550 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:36:51.553+0000 7f1cf6f29640 0 -- 192.168.123.105:0/3966069388 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55abe2fb3320 msgr2=0x55abe2fe8390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T11:36:51.553 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:36:51.557 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:36:51.557 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.89
2026-03-24T11:36:51.613 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:36:51.613+0000 7f3df7d27640 0 -- 192.168.123.105:0/3814821710 >> [v2:192.168.123.105:6808/3270659984,v1:192.168.123.105:6809/3270659984] conn(0x5575490a1d60 msgr2=0x5575490c21e0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:36:51.615 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:36:51.618 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:36:51.618 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.90
2026-03-24T11:36:51.673 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:36:51.673+0000 7fb30b5a6640 0 -- 192.168.123.105:0/2562864998 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55d6a18ac150 msgr2=0x55d6a1865da0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:36:51.673 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:36:51.676 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:36:51.676 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.91
2026-03-24T11:36:51.736 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:36:51.737+0000 7efed4d70640 0 -- 192.168.123.105:0/2017328079 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55d4a15c3320 msgr2=0x55d4a15f8390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:36:51.736 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:36:51.740 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:36:51.740 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.92
2026-03-24T11:36:51.799 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:36:51.801+0000 7fc9fe5ed640 0 -- 192.168.123.105:0/2896802556 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7fc9d8012830 msgr2=0x7fc9d8012ca0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:36:51.800 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:36:51.804 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:36:51.804 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.93
2026-03-24T11:36:51.858 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:36:51.861+0000 7f638075d640 0 -- 192.168.123.105:0/2294403414 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x560e674df320 msgr2=0x560e67514390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T11:36:51.862 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:36:51.865 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:36:51.865 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.94
2026-03-24T11:36:51.921 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:36:51.925+0000 7f1c2a1e7640 0 -- 192.168.123.105:0/1834759015 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x558f694d4320 msgr2=0x558f69509390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:36:51.922 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:36:51.925 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:36:51.925 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.95
2026-03-24T11:36:51.982 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:36:51.985+0000 7f50b2876640 0 -- 192.168.123.105:0/2972123686 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x563b18455320 msgr2=0x563b1848a8a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:36:51.984 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:36:51.987 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:36:51.987 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.96
2026-03-24T11:36:52.043 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:36:52.045+0000 7f4eb339b640 0 -- 192.168.123.105:0/1151318978 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7f4e8c00b8f0 msgr2=0x7f4e8c004c70 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:36:52.046 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:36:52.049 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:36:52.049 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.97
2026-03-24T11:36:52.104 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:36:52.105+0000 7fe69597a640 0 -- 192.168.123.105:0/2133471286 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7fe678008d30 msgr2=0x7fe6780291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:36:52.106 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:36:52.109 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:36:52.109 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.98
2026-03-24T11:36:52.162 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:36:52.165+0000 7f9e57fff640 0 -- 192.168.123.105:0/4079682343 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7f9e34012e10 msgr2=0x7f9e34013280 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:36:52.163 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:36:52.166 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in $(seq -w 00 99)
2026-03-24T11:36:52.166 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm image.99
2026-03-24T11:36:52.221 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:36:52.221+0000 7f1ac04e4640 0 -- 192.168.123.105:0/1087895432 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7f1aa005be10 msgr2=0x7f1aa007c1f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:36:52.223 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:36:52.226 INFO:tasks.workunit.client.0.vm05.stdout:testing remove...
2026-03-24T11:36:52.227 INFO:tasks.workunit.client.0.vm05.stderr:+ test_remove
2026-03-24T11:36:52.227 INFO:tasks.workunit.client.0.vm05.stderr:+ echo 'testing remove...'
2026-03-24T11:36:52.227 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images
2026-03-24T11:36:52.227 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:52.280 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:52.332 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:52.384 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:52.436 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:52.490 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:52.542 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:52.595 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:52.647 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:52.699 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:52.753 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:52.806 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:52.861 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:52.916 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:52.970 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:53.022 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:53.076 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:53.130 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:53.182 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd remove NOT_EXIST
2026-03-24T11:36:53.212 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 0% complete...failed.
2026-03-24T11:36:53.212 INFO:tasks.workunit.client.0.vm05.stderr:rbd: delete error: (2) No such file or directory
2026-03-24T11:36:53.216 INFO:tasks.workunit.client.0.vm05.stderr:+ true
2026-03-24T11:36:53.216 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 1 -s 1 test1
2026-03-24T11:36:53.230 INFO:tasks.workunit.client.0.vm05.stderr:rbd: image format 1 is deprecated
2026-03-24T11:36:53.236 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:36:53.237+0000 7f2c81e96200 -1 librbd: Forced V1 image creation.
2026-03-24T11:36:53.242 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test1
2026-03-24T11:36:53.268 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:36:53.271 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls
2026-03-24T11:36:53.271 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:36:53.271 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^0$'
2026-03-24T11:36:53.291 INFO:tasks.workunit.client.0.vm05.stdout:0
2026-03-24T11:36:53.292 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 -s 1 test2
2026-03-24T11:36:53.321 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test2
2026-03-24T11:36:53.376 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:36:53.377+0000 7f412b7fe640 0 -- 192.168.123.105:0/2488935937 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7f410c006fe0 msgr2=0x7f410c0273c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:36:53.378 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:36:53.382 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls
2026-03-24T11:36:53.382 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:36:53.382 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^0$'
2026-03-24T11:36:53.403 INFO:tasks.workunit.client.0.vm05.stdout:0
2026-03-24T11:36:53.403 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 1 -s 1 test1
2026-03-24T11:36:53.417 INFO:tasks.workunit.client.0.vm05.stderr:rbd: image format 1 is deprecated
2026-03-24T11:36:53.424 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:36:53.425+0000 7fa1fd818200 -1 librbd: Forced V1 image creation.
2026-03-24T11:36:53.430 INFO:tasks.workunit.client.0.vm05.stderr:+ rados rm -p rbd test1.rbd
2026-03-24T11:36:53.452 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test1
2026-03-24T11:36:53.474 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:36:53.476 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls
2026-03-24T11:36:53.476 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:36:53.476 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^0$'
2026-03-24T11:36:53.497 INFO:tasks.workunit.client.0.vm05.stdout:0
2026-03-24T11:36:53.497 INFO:tasks.workunit.client.0.vm05.stderr:+ '[' 0 -eq 0 ']'
2026-03-24T11:36:53.497 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 -s 1 test2
2026-03-24T11:36:53.529 INFO:tasks.workunit.client.0.vm05.stderr:++ rados -p rbd ls
2026-03-24T11:36:53.529 INFO:tasks.workunit.client.0.vm05.stderr:++ grep '^rbd_header'
2026-03-24T11:36:53.552 INFO:tasks.workunit.client.0.vm05.stderr:+ HEADER=rbd_header.189f595d14dd
2026-03-24T11:36:53.552 INFO:tasks.workunit.client.0.vm05.stderr:+ rados -p rbd rm rbd_header.189f595d14dd
2026-03-24T11:36:53.573 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test2
2026-03-24T11:36:53.594 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:36:53.597+0000 7f0b19f1a640 -1 librbd::image::OpenRequest: failed to retrieve initial metadata: (2) No such file or directory
2026-03-24T11:36:53.595 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:36:53.597+0000 7f0b19f1a640 -1 librbd::image::OpenRequest: failed to retrieve initial metadata: (2) No such file or directory
2026-03-24T11:36:53.602 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:36:53.605+0000 7f0b19f1a640 -1 librbd::image::OpenRequest: failed to retrieve initial metadata: (2) No such file or directory
2026-03-24T11:36:53.608 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:36:53.611 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls
2026-03-24T11:36:53.611 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:36:53.611 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^0$'
2026-03-24T11:36:53.633 INFO:tasks.workunit.client.0.vm05.stdout:0
2026-03-24T11:36:53.633 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 -s 1 test2
2026-03-24T11:36:53.662 INFO:tasks.workunit.client.0.vm05.stderr:+ rados -p rbd rm rbd_id.test2
2026-03-24T11:36:53.682 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test2
2026-03-24T11:36:53.736 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:36:53.737+0000 7f30ac4ca640 0 -- 192.168.123.105:0/2768526998 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55778afd2320 msgr2=0x55778afb0800 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:36:53.737 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:36:53.741 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls
2026-03-24T11:36:53.741 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:36:53.741 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^0$'
2026-03-24T11:36:53.762 INFO:tasks.workunit.client.0.vm05.stdout:0
2026-03-24T11:36:53.762 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 -s 1 test2
2026-03-24T11:36:53.791 INFO:tasks.workunit.client.0.vm05.stderr:++ rados -p rbd ls
2026-03-24T11:36:53.791 INFO:tasks.workunit.client.0.vm05.stderr:++ grep '^rbd_header'
2026-03-24T11:36:53.814 INFO:tasks.workunit.client.0.vm05.stderr:+ HEADER=rbd_header.18bd9e2a7bb5
2026-03-24T11:36:53.814 INFO:tasks.workunit.client.0.vm05.stderr:+ rados -p rbd rm rbd_header.18bd9e2a7bb5
2026-03-24T11:36:53.835 INFO:tasks.workunit.client.0.vm05.stderr:+ rados -p rbd rm rbd_id.test2
2026-03-24T11:36:53.856 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test2
2026-03-24T11:36:53.877 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:36:53.881+0000 7f710e7fc640 -1 librbd::image::OpenRequest: failed to retrieve initial metadata: (2) No such file or directory
2026-03-24T11:36:53.879 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:36:53.881+0000 7f710effd640 -1 librbd::image::OpenRequest: failed to retrieve initial metadata: (2) No such file or directory
2026-03-24T11:36:53.887 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:36:53.889+0000 7f710e7fc640 -1 librbd::image::OpenRequest: failed to retrieve initial metadata: (2) No such file or directory
2026-03-24T11:36:53.892 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:36:53.895 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls
2026-03-24T11:36:53.895 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:36:53.895 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^0$'
2026-03-24T11:36:53.916 INFO:tasks.workunit.client.0.vm05.stdout:0
2026-03-24T11:36:53.916 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 -s 1 test2
2026-03-24T11:36:53.947 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create test2@snap
2026-03-24T11:36:54.459 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T11:36:54.465 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap protect test2@snap
2026-03-24T11:36:54.495 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone test2@snap clone --rbd-default-clone-format 1
2026-03-24T11:36:54.534 INFO:tasks.workunit.client.0.vm05.stderr:+ rados -p rbd rm rbd_children
2026-03-24T11:36:54.556 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm clone
2026-03-24T11:36:54.611 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:36:54.613+0000 7f75e2752640 0 -- 192.168.123.105:0/275392137 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55c4598d1360 msgr2=0x55c459902fa0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T11:36:54.615 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:36:54.618 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls
2026-03-24T11:36:54.618 INFO:tasks.workunit.client.0.vm05.stderr:+ grep clone
2026-03-24T11:36:54.618 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:36:54.618 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^0$'
2026-03-24T11:36:54.639 INFO:tasks.workunit.client.0.vm05.stdout:0
2026-03-24T11:36:54.639 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap unprotect test2@snap
2026-03-24T11:36:54.670 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap rm test2@snap
2026-03-24T11:36:55.463 INFO:tasks.workunit.client.0.vm05.stderr: Removing snap: 100% complete...done.
2026-03-24T11:36:55.470 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test2
2026-03-24T11:36:55.527 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:36:55.529+0000 7f2434b31640 0 -- 192.168.123.105:0/2547298427 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55aab750c150 msgr2=0x55aab74fbf80 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:36:55.532 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:36:55.536 INFO:tasks.workunit.client.0.vm05.stdout:testing migration...
2026-03-24T11:36:55.536 INFO:tasks.workunit.client.0.vm05.stderr:+ test_migration
2026-03-24T11:36:55.536 INFO:tasks.workunit.client.0.vm05.stderr:+ echo 'testing migration...'
2026-03-24T11:36:55.536 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images
2026-03-24T11:36:55.536 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:55.795 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:55.849 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:55.903 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:55.960 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:56.015 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:56.069 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:56.121 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:56.176 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:56.230 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:56.520 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:56.575 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:56.630 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:56.682 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:56.737 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:56.791 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:56.845 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:57.101 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:36:57.157 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd pool create rbd2 8
2026-03-24T11:36:57.564 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd2' already exists
2026-03-24T11:36:57.577 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd pool init rbd2
2026-03-24T11:36:57.795 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:36:57.801+0000 7f38ceebb640 0 --2- 192.168.123.105:0/371123652 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x5646a1d160a0 0x5646a1d284f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
2026-03-24T11:37:00.539 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 1 -s 128M test1
2026-03-24T11:37:00.555 INFO:tasks.workunit.client.0.vm05.stderr:rbd: image format 1 is deprecated
2026-03-24T11:37:00.561 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:00.565+0000 7fc48376b200 -1 librbd: Forced V1 image creation.
2026-03-24T11:37:00.568 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info test1
2026-03-24T11:37:00.568 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'format: 1'
2026-03-24T11:37:00.591 INFO:tasks.workunit.client.0.vm05.stdout: format: 1
2026-03-24T11:37:00.591 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration prepare test1 --image-format 2
2026-03-24T11:37:00.637 INFO:tasks.workunit.client.0.vm05.stderr:++ get_migration_state test1
2026-03-24T11:37:00.637 INFO:tasks.workunit.client.0.vm05.stderr:++ local image=test1
2026-03-24T11:37:00.637 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd --format xml status test1
2026-03-24T11:37:00.637 INFO:tasks.workunit.client.0.vm05.stderr:++ xmlstarlet sel -t -v //status/migration/state
2026-03-24T11:37:00.681 INFO:tasks.workunit.client.0.vm05.stderr:+ test prepared = prepared
2026-03-24T11:37:00.681 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info test1
2026-03-24T11:37:00.681 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'format: 2'
2026-03-24T11:37:00.711 INFO:tasks.workunit.client.0.vm05.stdout: format: 2
2026-03-24T11:37:00.711 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test1
2026-03-24T11:37:00.737 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:00.741+0000 7f1b95809200 -1 librbd::image::PreRemoveRequest: 0x55b4c287e220 validate_image_removal: image in migration state - not removing
2026-03-24T11:37:00.739 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 0% complete...failed.
2026-03-24T11:37:00.739 INFO:tasks.workunit.client.0.vm05.stderr:rbd: error: image still has watchers
2026-03-24T11:37:00.739 INFO:tasks.workunit.client.0.vm05.stderr:This means the image is still open or the client using it crashed. Try again after closing/unmapping it or waiting 30s for the crashed client to timeout.
2026-03-24T11:37:00.742 INFO:tasks.workunit.client.0.vm05.stderr:+ true
2026-03-24T11:37:00.742 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration execute test1
2026-03-24T11:37:00.803 INFO:tasks.workunit.client.0.vm05.stderr: Image migration: 3% complete... Image migration: 6% complete... Image migration: 9% complete... Image migration: 12% complete... Image migration: 15% complete... Image migration: 18% complete... Image migration: 21% complete... Image migration: 25% complete... Image migration: 28% complete... Image migration: 31% complete... Image migration: 34% complete... Image migration: 37% complete... Image migration: 40% complete... Image migration: 43% complete... Image migration: 46% complete... Image migration: 50% complete... Image migration: 53% complete... Image migration: 56% complete... Image migration: 59% complete... Image migration: 62% complete... Image migration: 65% complete... Image migration: 68% complete... Image migration: 71% complete... Image migration: 75% complete... Image migration: 78% complete... Image migration: 81% complete... Image migration: 84% complete... Image migration: 87% complete... Image migration: 90% complete... Image migration: 93% complete... Image migration: 96% complete...2026-03-24T11:37:00.805+0000 7f81e3d32640 0 -- 192.168.123.105:0/1603956172 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x557151608b60 msgr2=0x55715169b050 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:37:00.806 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:00.809+0000 7f81e3d32640 0 -- 192.168.123.105:0/1603956172 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7f81c405c4a0 msgr2=0x7f81c407c8a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:37:00.811 INFO:tasks.workunit.client.0.vm05.stderr: Image migration: 100% complete...done.
2026-03-24T11:37:00.815 INFO:tasks.workunit.client.0.vm05.stderr:++ get_migration_state test1
2026-03-24T11:37:00.815 INFO:tasks.workunit.client.0.vm05.stderr:++ local image=test1
2026-03-24T11:37:00.815 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd --format xml status test1
2026-03-24T11:37:00.815 INFO:tasks.workunit.client.0.vm05.stderr:++ xmlstarlet sel -t -v //status/migration/state
2026-03-24T11:37:00.858 INFO:tasks.workunit.client.0.vm05.stderr:+ test executed = executed
2026-03-24T11:37:00.858 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration commit test1
2026-03-24T11:37:00.907 INFO:tasks.workunit.client.0.vm05.stderr: Commit image migration: 3% complete... Commit image migration: 6% complete... Commit image migration: 9% complete... Commit image migration: 12% complete... Commit image migration: 15% complete... Commit image migration: 18% complete... Commit image migration: 21% complete... Commit image migration: 25% complete... Commit image migration: 28% complete...2026-03-24T11:37:00.909+0000 7f569ffff640 0 -- 192.168.123.105:0/967427063 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7f567c01a090 msgr2=0x7f567c01b2d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T11:37:01.111 INFO:tasks.workunit.client.0.vm05.stderr: Commit image migration: 31% complete... Commit image migration: 34% complete... Commit image migration: 37% complete... Commit image migration: 40% complete... Commit image migration: 43% complete... Commit image migration: 46% complete... Commit image migration: 50% complete... Commit image migration: 53% complete... Commit image migration: 56% complete... Commit image migration: 59% complete... Commit image migration: 62% complete... Commit image migration: 65% complete... Commit image migration: 68% complete... Commit image migration: 71% complete... Commit image migration: 75% complete... Commit image migration: 78% complete... Commit image migration: 81% complete... Commit image migration: 84% complete... Commit image migration: 87% complete... Commit image migration: 90% complete...2026-03-24T11:37:01.113+0000 7f569ffff640 0 -- 192.168.123.105:0/967427063 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7f567c01a090 msgr2=0x7f567c01b2d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:37:01.118 INFO:tasks.workunit.client.0.vm05.stderr: Commit image migration: 93% complete... Commit image migration: 96% complete... Commit image migration: 100% complete...done.
2026-03-24T11:37:01.123 INFO:tasks.workunit.client.0.vm05.stderr:+ get_migration_state test1
2026-03-24T11:37:01.123 INFO:tasks.workunit.client.0.vm05.stderr:+ local image=test1
2026-03-24T11:37:01.123 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd --format xml status test1
2026-03-24T11:37:01.123 INFO:tasks.workunit.client.0.vm05.stderr:+ xmlstarlet sel -t -v //status/migration/state
2026-03-24T11:37:01.148 INFO:tasks.workunit.client.0.vm05.stderr:+ true
2026-03-24T11:37:01.148 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info test1
2026-03-24T11:37:01.148 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'features: .*layering'
2026-03-24T11:37:01.173 INFO:tasks.workunit.client.0.vm05.stderr:+ true
2026-03-24T11:37:01.173 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration prepare test1 --image-feature layering,exclusive-lock,object-map,fast-diff,deep-flatten
2026-03-24T11:37:01.231 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info test1
2026-03-24T11:37:01.231 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'features: .*layering'
2026-03-24T11:37:01.259 INFO:tasks.workunit.client.0.vm05.stdout: features: layering, exclusive-lock, object-map, fast-diff, deep-flatten, migrating
2026-03-24T11:37:01.260 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration execute test1
2026-03-24T11:37:01.302 INFO:tasks.workunit.client.0.vm05.stderr: Image migration: 3% complete... Image migration: 6% complete... Image migration: 9% complete... Image migration: 12% complete... Image migration: 15% complete... Image migration: 18% complete... Image migration: 21% complete... Image migration: 25% complete... Image migration: 28% complete... Image migration: 31% complete... Image migration: 34% complete... Image migration: 37% complete... Image migration: 40% complete... Image migration: 43% complete... Image migration: 46% complete... Image migration: 50% complete... Image migration: 53% complete... Image migration: 56% complete... Image migration: 59% complete... Image migration: 62% complete... Image migration: 65% complete... Image migration: 68% complete... Image migration: 71% complete... Image migration: 75% complete... Image migration: 78% complete... Image migration: 81% complete... Image migration: 84% complete... Image migration: 87% complete... Image migration: 90% complete... Image migration: 93% complete... Image migration: 96% complete...2026-03-24T11:37:01.305+0000 7fa48f958640 0 -- 192.168.123.105:0/1960164238 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x556125828b60 msgr2=0x5561259675d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:37:01.307 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:01.309+0000 7fa48f958640 0 -- 192.168.123.105:0/1960164238 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7fa46c05c770 msgr2=0x7fa46c07cb70 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:37:01.311 INFO:tasks.workunit.client.0.vm05.stderr: Image migration: 100% complete...done.
2026-03-24T11:37:01.314 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration commit test1
2026-03-24T11:37:01.364 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:01.365+0000 7f651fa63640 0 -- 192.168.123.105:0/4139378278 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x555c180c8160 msgr2=0x555c1820a350 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T11:37:01.370 INFO:tasks.workunit.client.0.vm05.stderr: Commit image migration: 3% complete... Commit image migration: 6% complete... Commit image migration: 9% complete... Commit image migration: 12% complete... Commit image migration: 15% complete... Commit image migration: 18% complete... Commit image migration: 21% complete... Commit image migration: 25% complete... Commit image migration: 28% complete... Commit image migration: 31% complete...2026-03-24T11:37:01.373+0000 7f651dfd9640 0 -- 192.168.123.105:0/4139378278 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7f650001cc50 msgr2=0x7f650001d0c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:37:01.382 INFO:tasks.workunit.client.0.vm05.stderr: Commit image migration: 34% complete... Commit image migration: 37% complete... Commit image migration: 40% complete... Commit image migration: 43% complete... Commit image migration: 46% complete... Commit image migration: 50% complete... Commit image migration: 53% complete... Commit image migration: 56% complete... Commit image migration: 59% complete... Commit image migration: 62% complete... Commit image migration: 65% complete... Commit image migration: 68% complete... Commit image migration: 71% complete... Commit image migration: 75% complete... Commit image migration: 78% complete... Commit image migration: 81% complete... Commit image migration: 84% complete... Commit image migration: 87% complete... Commit image migration: 90% complete... Commit image migration: 93% complete... Commit image migration: 96% complete... Commit image migration: 100% complete...done.
2026-03-24T11:37:01.386 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration prepare test1 rbd2/test1
2026-03-24T11:37:01.442 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:01.445+0000 7f3d579bd640 0 -- 192.168.123.105:0/2921834216 >> [v2:192.168.123.105:6808/3270659984,v1:192.168.123.105:6809/3270659984] conn(0x7f3d34004a30 msgr2=0x7f3d34024e10 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:37:01.447 INFO:tasks.workunit.client.0.vm05.stderr:++ get_migration_state rbd2/test1
2026-03-24T11:37:01.447 INFO:tasks.workunit.client.0.vm05.stderr:++ local image=rbd2/test1
2026-03-24T11:37:01.447 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd --format xml status rbd2/test1
2026-03-24T11:37:01.447 INFO:tasks.workunit.client.0.vm05.stderr:++ xmlstarlet sel -t -v //status/migration/state
2026-03-24T11:37:01.494 INFO:tasks.workunit.client.0.vm05.stderr:+ test prepared = prepared
2026-03-24T11:37:01.494 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls
2026-03-24T11:37:01.494 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:37:01.494 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^0$'
2026-03-24T11:37:01.515 INFO:tasks.workunit.client.0.vm05.stdout:0
2026-03-24T11:37:01.515 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd -p rbd2 ls
2026-03-24T11:37:01.515 INFO:tasks.workunit.client.0.vm05.stderr:+ grep test1
2026-03-24T11:37:01.588 INFO:tasks.workunit.client.0.vm05.stdout:test1
2026-03-24T11:37:01.588 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration execute test1
2026-03-24T11:37:01.637 INFO:tasks.workunit.client.0.vm05.stderr: Image migration: 3% complete... Image migration: 6% complete... Image migration: 9% complete... Image migration: 12% complete... Image migration: 15% complete... Image migration: 18% complete... Image migration: 21% complete... Image migration: 25% complete... Image migration: 28% complete... Image migration: 31% complete... Image migration: 34% complete... Image migration: 37% complete... Image migration: 40% complete... Image migration: 43% complete... Image migration: 46% complete... Image migration: 50% complete... Image migration: 53% complete... Image migration: 56% complete... Image migration: 59% complete... Image migration: 62% complete... Image migration: 65% complete... Image migration: 68% complete... Image migration: 71% complete... Image migration: 75% complete... Image migration: 78% complete... Image migration: 81% complete... Image migration: 84% complete... Image migration: 87% complete... Image migration: 90% complete... Image migration: 93% complete... Image migration: 96% complete...2026-03-24T11:37:01.641+0000 7f942a6c8640 0 -- 192.168.123.105:0/561880835 >> [v2:192.168.123.105:6808/3270659984,v1:192.168.123.105:6809/3270659984] conn(0x7f9400004cf0 msgr2=0x7f94000250d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:37:01.641 INFO:tasks.workunit.client.0.vm05.stderr: Image migration: 100% complete...done.
2026-03-24T11:37:01.645 INFO:tasks.workunit.client.0.vm05.stderr:++ get_migration_state rbd2/test1
2026-03-24T11:37:01.645 INFO:tasks.workunit.client.0.vm05.stderr:++ local image=rbd2/test1
2026-03-24T11:37:01.646 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd --format xml status rbd2/test1
2026-03-24T11:37:01.646 INFO:tasks.workunit.client.0.vm05.stderr:++ xmlstarlet sel -t -v //status/migration/state
2026-03-24T11:37:01.687 INFO:tasks.workunit.client.0.vm05.stderr:+ test executed = executed
2026-03-24T11:37:01.687 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm rbd2/test1
2026-03-24T11:37:01.714 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:01.717+0000 7f94727fc640 -1 librbd::image::PreRemoveRequest: 0x55bd9346c510 validate_image_removal: image in migration state - not removing
2026-03-24T11:37:01.717 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 0% complete...failed.
2026-03-24T11:37:01.717 INFO:tasks.workunit.client.0.vm05.stderr:rbd: error: image still has watchers
2026-03-24T11:37:01.717 INFO:tasks.workunit.client.0.vm05.stderr:This means the image is still open or the client using it crashed. Try again after closing/unmapping it or waiting 30s for the crashed client to timeout.
2026-03-24T11:37:01.720 INFO:tasks.workunit.client.0.vm05.stderr:+ true
2026-03-24T11:37:01.720 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration commit test1
2026-03-24T11:37:01.776 INFO:tasks.workunit.client.0.vm05.stderr: Commit image migration: 3% complete... Commit image migration: 6% complete... Commit image migration: 9% complete... Commit image migration: 12% complete... Commit image migration: 15% complete... Commit image migration: 18% complete... Commit image migration: 21% complete... Commit image migration: 25% complete... Commit image migration: 28% complete... Commit image migration: 31% complete... Commit image migration: 34% complete... Commit image migration: 37% complete... Commit image migration: 40% complete... Commit image migration: 43% complete... Commit image migration: 46% complete... Commit image migration: 50% complete... Commit image migration: 53% complete... Commit image migration: 56% complete... Commit image migration: 59% complete... Commit image migration: 62% complete... Commit image migration: 65% complete... Commit image migration: 68% complete... Commit image migration: 71% complete... Commit image migration: 75% complete... Commit image migration: 78% complete... Commit image migration: 81% complete... Commit image migration: 84% complete... Commit image migration: 87% complete... Commit image migration: 90% complete... Commit image migration: 93% complete... Commit image migration: 96% complete...2026-03-24T11:37:01.777+0000 7f0d4bacd640 0 -- 192.168.123.105:0/2111432858 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x5584dee22580 msgr2=0x5584deeb54f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T11:37:01.782 INFO:tasks.workunit.client.0.vm05.stderr: Commit image migration: 100% complete...done.
2026-03-24T11:37:01.786 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd namespace create rbd2/ns1
2026-03-24T11:37:01.811 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd namespace create rbd2/ns2
2026-03-24T11:37:01.836 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration prepare rbd2/test1 rbd2/ns1/test1
2026-03-24T11:37:01.892 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:01.893+0000 7f93dc9dc640 0 -- 192.168.123.105:0/2895341071 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x55f4f07ef450 msgr2=0x55f4f0824b20 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:37:01.896 INFO:tasks.workunit.client.0.vm05.stderr:++ get_migration_state rbd2/ns1/test1
2026-03-24T11:37:01.896 INFO:tasks.workunit.client.0.vm05.stderr:++ local image=rbd2/ns1/test1
2026-03-24T11:37:01.897 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd --format xml status rbd2/ns1/test1
2026-03-24T11:37:01.897 INFO:tasks.workunit.client.0.vm05.stderr:++ xmlstarlet sel -t -v //status/migration/state
2026-03-24T11:37:01.943 INFO:tasks.workunit.client.0.vm05.stderr:+ test prepared = prepared
2026-03-24T11:37:01.943 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration execute rbd2/test1
2026-03-24T11:37:01.991 INFO:tasks.workunit.client.0.vm05.stderr: Image migration: 3% complete... Image migration: 6% complete... Image migration: 9% complete... Image migration: 12% complete... Image migration: 15% complete... Image migration: 18% complete... Image migration: 21% complete... Image migration: 25% complete... Image migration: 28% complete... Image migration: 31% complete... Image migration: 34% complete... Image migration: 37% complete... Image migration: 40% complete... Image migration: 43% complete... Image migration: 46% complete... Image migration: 50% complete... Image migration: 53% complete... Image migration: 56% complete... Image migration: 59% complete... Image migration: 62% complete... Image migration: 65% complete... Image migration: 68% complete... Image migration: 71% complete... Image migration: 75% complete... Image migration: 78% complete... Image migration: 81% complete... Image migration: 84% complete... Image migration: 87% complete... Image migration: 90% complete... Image migration: 93% complete... Image migration: 96% complete...2026-03-24T11:37:01.993+0000 7f527e130640 0 -- 192.168.123.105:0/2252384258 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x557a843a3450 msgr2=0x557a843c3830 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:37:01.994 INFO:tasks.workunit.client.0.vm05.stderr: Image migration: 100% complete...done.
2026-03-24T11:37:01.997 INFO:tasks.workunit.client.0.vm05.stderr:++ get_migration_state rbd2/ns1/test1
2026-03-24T11:37:01.997 INFO:tasks.workunit.client.0.vm05.stderr:++ local image=rbd2/ns1/test1
2026-03-24T11:37:01.998 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd --format xml status rbd2/ns1/test1
2026-03-24T11:37:01.998 INFO:tasks.workunit.client.0.vm05.stderr:++ xmlstarlet sel -t -v //status/migration/state
2026-03-24T11:37:02.040 INFO:tasks.workunit.client.0.vm05.stderr:+ test executed = executed
2026-03-24T11:37:02.041 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration commit rbd2/test1
2026-03-24T11:37:02.091 INFO:tasks.workunit.client.0.vm05.stderr: Commit image migration: 3% complete... Commit image migration: 6% complete... Commit image migration: 9% complete... Commit image migration: 12% complete... Commit image migration: 15% complete... Commit image migration: 18% complete... Commit image migration: 21% complete... Commit image migration: 25% complete... Commit image migration: 28% complete... Commit image migration: 31% complete... Commit image migration: 34% complete... Commit image migration: 37% complete... Commit image migration: 40% complete... Commit image migration: 43% complete... Commit image migration: 46% complete... Commit image migration: 50% complete... Commit image migration: 53% complete... Commit image migration: 56% complete... Commit image migration: 59% complete... Commit image migration: 62% complete... Commit image migration: 65% complete... Commit image migration: 68% complete... Commit image migration: 71% complete... Commit image migration: 75% complete... Commit image migration: 78% complete... Commit image migration: 81% complete... Commit image migration: 84% complete... Commit image migration: 87% complete... Commit image migration: 90% complete... Commit image migration: 93% complete... Commit image migration: 96% complete...2026-03-24T11:37:02.093+0000 7f828e1a3640 0 -- 192.168.123.105:0/3892668867 >> [v2:192.168.123.105:6808/3270659984,v1:192.168.123.105:6809/3270659984] conn(0x55df0d2d9160 msgr2=0x55df0d41ad80 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:37:02.302 INFO:tasks.workunit.client.0.vm05.stderr: Commit image migration: 100% complete...done.
2026-03-24T11:37:02.306 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration prepare rbd2/ns1/test1 rbd2/ns2/test1
2026-03-24T11:37:02.360 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:02.361+0000 7f985eb04640 0 -- 192.168.123.105:0/1191473740 >> [v2:192.168.123.105:6808/3270659984,v1:192.168.123.105:6809/3270659984] conn(0x5557225d1ab0 msgr2=0x5557225f1e90 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:37:02.367 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration execute rbd2/ns2/test1
2026-03-24T11:37:02.408 INFO:tasks.workunit.client.0.vm05.stderr: Image migration: 3% complete... Image migration: 6% complete... Image migration: 9% complete... Image migration: 12% complete... Image migration: 15% complete... Image migration: 18% complete... Image migration: 21% complete... Image migration: 25% complete... Image migration: 28% complete... Image migration: 31% complete... Image migration: 34% complete... Image migration: 37% complete... Image migration: 40% complete... Image migration: 43% complete... Image migration: 46% complete... Image migration: 50% complete... Image migration: 53% complete... Image migration: 56% complete... Image migration: 59% complete... Image migration: 62% complete... Image migration: 65% complete... Image migration: 68% complete... Image migration: 71% complete... Image migration: 75% complete... Image migration: 78% complete... Image migration: 81% complete... Image migration: 84% complete... Image migration: 87% complete... Image migration: 90% complete... Image migration: 93% complete... Image migration: 96% complete...2026-03-24T11:37:02.409+0000 7fcf5baec640 0 -- 192.168.123.105:0/1674405330 >> [v2:192.168.123.105:6808/3270659984,v1:192.168.123.105:6809/3270659984] conn(0x56282b806490 msgr2=0x56282b826870 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:37:02.414 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:02.417+0000 7fcf5baec640 0 -- 192.168.123.105:0/1674405330 >> [v2:192.168.123.105:6808/3270659984,v1:192.168.123.105:6809/3270659984] conn(0x7fcf3c05c3d0 msgr2=0x7fcf3c07c7d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:37:02.417 INFO:tasks.workunit.client.0.vm05.stderr: Image migration: 100% complete...done.
2026-03-24T11:37:02.421 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration commit rbd2/ns2/test1
2026-03-24T11:37:02.472 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:02.473+0000 7fdeed3eb640 0 -- 192.168.123.105:0/2847100793 >> [v2:192.168.123.105:6808/3270659984,v1:192.168.123.105:6809/3270659984] conn(0x55d524f0ca70 msgr2=0x55d524f2ce50 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:37:02.477 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:02.481+0000 7fdeecbea640 0 -- 192.168.123.105:0/2847100793 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7fdec800caa0 msgr2=0x7fdec800cf10 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T11:37:02.692 INFO:tasks.workunit.client.0.vm05.stderr: Commit image migration: 3% complete... Commit image migration: 6% complete... Commit image migration: 9% complete... Commit image migration: 12% complete... Commit image migration: 15% complete... Commit image migration: 18% complete... Commit image migration: 21% complete... Commit image migration: 25% complete... Commit image migration: 28% complete... Commit image migration: 31% complete... Commit image migration: 34% complete... Commit image migration: 37% complete... Commit image migration: 40% complete... Commit image migration: 43% complete... Commit image migration: 46% complete... Commit image migration: 50% complete... Commit image migration: 53% complete... Commit image migration: 56% complete... Commit image migration: 59% complete... Commit image migration: 62% complete... Commit image migration: 65% complete... Commit image migration: 68% complete... Commit image migration: 71% complete... Commit image migration: 75% complete... Commit image migration: 78% complete... Commit image migration: 81% complete... Commit image migration: 84% complete... Commit image migration: 87% complete... Commit image migration: 90% complete... Commit image migration: 93% complete... Commit image migration: 96% complete... Commit image migration: 100% complete...done.
2026-03-24T11:37:02.696 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create -s 128M test1
2026-03-24T11:37:02.717 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:02.717+0000 7f7f273e2200 -1 librbd: Forced V1 image creation.
2026-03-24T11:37:02.725 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration prepare test1 --data-pool rbd2
2026-03-24T11:37:02.770 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info test1
2026-03-24T11:37:02.770 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'data_pool: rbd2'
2026-03-24T11:37:02.797 INFO:tasks.workunit.client.0.vm05.stdout: data_pool: rbd2
2026-03-24T11:37:02.797 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration execute test1
2026-03-24T11:37:02.849 INFO:tasks.workunit.client.0.vm05.stderr: Image migration: 3% complete... Image migration: 6% complete... Image migration: 9% complete... Image migration: 12% complete... Image migration: 15% complete... Image migration: 18% complete... Image migration: 21% complete... Image migration: 25% complete... Image migration: 28% complete... Image migration: 31% complete... Image migration: 34% complete... Image migration: 37% complete... Image migration: 40% complete... Image migration: 43% complete... Image migration: 46% complete... Image migration: 50% complete... Image migration: 53% complete... Image migration: 56% complete... Image migration: 59% complete... Image migration: 62% complete... Image migration: 65% complete... Image migration: 68% complete... Image migration: 71% complete... Image migration: 75% complete... Image migration: 78% complete... Image migration: 81% complete... Image migration: 84% complete... Image migration: 87% complete... Image migration: 90% complete... Image migration: 93% complete... Image migration: 96% complete...2026-03-24T11:37:02.849+0000 7f876a351640 0 -- 192.168.123.105:0/3242536956 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55d0988bbe40 msgr2=0x55d0989fcf20 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T11:37:02.858 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:02.861+0000 7f876a351640 0 -- 192.168.123.105:0/3242536956 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7f874805c4a0 msgr2=0x7f874807c8a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:37:02.862 INFO:tasks.workunit.client.0.vm05.stderr: Image migration: 100% complete...done.
2026-03-24T11:37:02.866 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration commit test1
2026-03-24T11:37:02.920 INFO:tasks.workunit.client.0.vm05.stderr: Commit image migration: 3% complete... Commit image migration: 6% complete... Commit image migration: 9% complete... Commit image migration: 12% complete... Commit image migration: 15% complete... Commit image migration: 18% complete... Commit image migration: 21% complete... Commit image migration: 25% complete... Commit image migration: 28% complete...2026-03-24T11:37:02.921+0000 7fc0ca7e8640 0 -- 192.168.123.105:0/3996364971 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7fc0a401f000 msgr2=0x7fc0a4023c50 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:37:02.925 INFO:tasks.workunit.client.0.vm05.stderr: Commit image migration: 31% complete... Commit image migration: 34% complete... Commit image migration: 37% complete... Commit image migration: 40% complete... Commit image migration: 43% complete... Commit image migration: 46% complete... Commit image migration: 50% complete... Commit image migration: 53% complete... Commit image migration: 56% complete... Commit image migration: 59% complete... Commit image migration: 62% complete... Commit image migration: 65% complete... Commit image migration: 68% complete... Commit image migration: 71% complete... Commit image migration: 75% complete... Commit image migration: 78% complete... Commit image migration: 81% complete... Commit image migration: 84% complete... Commit image migration: 87% complete... Commit image migration: 90% complete... Commit image migration: 93% complete...2026-03-24T11:37:02.929+0000 7fc0ca7e8640 0 -- 192.168.123.105:0/3996364971 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7fc0ac07d250 msgr2=0x7fc0ac09d630 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:37:03.137 INFO:tasks.workunit.client.0.vm05.stderr: Commit image migration: 96% complete... Commit image migration: 100% complete...done.
2026-03-24T11:37:03.141 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration prepare test1 2026-03-24T11:37:03.195 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd trash mv test1 2026-03-24T11:37:03.195 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv test1 2026-03-24T11:37:03.223 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:03.225+0000 7f595216a200 -1 librbd::api::Trash: move: cannot move migrating image to trash 2026-03-24T11:37:03.224 INFO:tasks.workunit.client.0.vm05.stderr:rbd: deferred delete error: (16) Device or resource busy 2026-03-24T11:37:03.227 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0 2026-03-24T11:37:03.228 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd trash ls -a 2026-03-24T11:37:03.228 INFO:tasks.workunit.client.0.vm05.stderr:++ cut -d ' ' -f 1 2026-03-24T11:37:03.251 INFO:tasks.workunit.client.0.vm05.stderr:+ ID=19bf40da5047 2026-03-24T11:37:03.251 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd trash rm 19bf40da5047 2026-03-24T11:37:03.251 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash rm 19bf40da5047 2026-03-24T11:37:03.278 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:03.281+0000 7fe10a41f640 -1 librbd::image::RefreshRequest: image being migrated 2026-03-24T11:37:03.278 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:03.281+0000 7fe10a41f640 -1 librbd::image::OpenRequest: failed to refresh image: (30) Read-only file system 2026-03-24T11:37:03.278 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:03.281+0000 7fe10a41f640 -1 librbd::ImageState: 0x7fe0e803c0e0 failed to open image: (30) Read-only file system 2026-03-24T11:37:03.278 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:03.281+0000 7fe0f0ff9640 -1 librbd::image::RemoveRequest: 0x7fe0e8000b90 handle_open_image: error opening image: (30) Read-only file system 2026-03-24T11:37:03.279 INFO:tasks.workunit.client.0.vm05.stderr:rbd: remove error: (30) Read-only file system 
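The `expect_fail` calls in the trace above run a command that is required to fail and invert its exit status: `+ rbd trash mv test1` errors out with `(16) Device or resource busy`, and the helper then logs `+ return 0`. A minimal sketch of such a helper (the workunit's actual definition may differ) is:

```shell
# Minimal expect_fail sketch matching the trace: the wrapped command
# must fail; its failure is converted into success (exit 0).
expect_fail() {
    if "$@"; then
        return 1    # unexpected success
    else
        return 0    # expected failure
    fi
}

expect_fail false && echo "caught expected failure"
```

In the log the same pattern guards `rbd trash mv`, `rbd trash rm`, and `rbd trash restore`, all of which must fail while the image is mid-migration.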
2026-03-24T11:37:03.279 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 0% complete...failed. 2026-03-24T11:37:03.282 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0 2026-03-24T11:37:03.282 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd trash restore 19bf40da5047 2026-03-24T11:37:03.283 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash restore 19bf40da5047 2026-03-24T11:37:03.305 INFO:tasks.workunit.client.0.vm05.stderr:rbd: restore error: (22) Invalid argument2026-03-24T11:37:03.305+0000 7f8f92a2b200 -1 librbd::api::Trash: restore: Current trash source 'migration' does not match expected: user,mirroring,unknown (4) 2026-03-24T11:37:03.305 INFO:tasks.workunit.client.0.vm05.stderr: 2026-03-24T11:37:03.308 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0 2026-03-24T11:37:03.308 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration abort test1 2026-03-24T11:37:03.356 INFO:tasks.workunit.client.0.vm05.stderr: Abort image migration: 1% complete... Abort image migration: 3% complete... Abort image migration: 4% complete... Abort image migration: 6% complete... Abort image migration: 7% complete... Abort image migration: 9% complete... Abort image migration: 10% complete... Abort image migration: 12% complete... Abort image migration: 14% complete... Abort image migration: 15% complete... Abort image migration: 17% complete... Abort image migration: 18% complete... Abort image migration: 20% complete... Abort image migration: 21% complete... Abort image migration: 23% complete... Abort image migration: 25% complete... Abort image migration: 26% complete... Abort image migration: 28% complete... Abort image migration: 29% complete... Abort image migration: 31% complete... Abort image migration: 32% complete... Abort image migration: 34% complete... Abort image migration: 35% complete... Abort image migration: 37% complete... Abort image migration: 39% complete... Abort image migration: 40% complete... 
Abort image migration: 42% complete... Abort image migration: 43% complete... [abort progress 45%–95% omitted] Abort image migration: 96% complete... 
Abort image migration: 98% complete...2026-03-24T11:37:03.357+0000 7fea9a05c640 0 -- 192.168.123.105:0/1948469792 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x564b72bb5b60 msgr2=0x564b72c45eb0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-24T11:37:03.366 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:03.369+0000 7fea9a05c640 0 -- 192.168.123.105:0/1948469792 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7fea7805c3d0 msgr2=0x7fea7807c7d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:37:03.372 INFO:tasks.workunit.client.0.vm05.stderr: Abort image migration: 100% complete...done. 2026-03-24T11:37:03.375 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd remove test1 2026-03-24T11:37:03.434 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 3% complete... Removing image: 6% complete... Removing image: 9% complete... Removing image: 12% complete... Removing image: 15% complete... Removing image: 18% complete... Removing image: 21% complete... Removing image: 25% complete... Removing image: 28% complete... Removing image: 31% complete... Removing image: 34% complete... Removing image: 37% complete... Removing image: 40% complete... Removing image: 43% complete... Removing image: 46% complete... Removing image: 50% complete... Removing image: 53% complete... Removing image: 56% complete... Removing image: 59% complete... Removing image: 62% complete... Removing image: 65% complete... Removing image: 68% complete... Removing image: 71% complete... Removing image: 75% complete... Removing image: 78% complete... Removing image: 81% complete... Removing image: 84% complete... Removing image: 87% complete... Removing image: 90% complete... Removing image: 93% complete... Removing image: 96% complete... Removing image: 100% complete...done. 
2026-03-24T11:37:03.437 INFO:tasks.workunit.client.0.vm05.stderr:+ dd if=/dev/urandom bs=1M count=1 2026-03-24T11:37:03.438 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd --image-format 2 import - test1 2026-03-24T11:37:03.473 INFO:tasks.workunit.client.0.vm05.stderr:1+0 records in 2026-03-24T11:37:03.473 INFO:tasks.workunit.client.0.vm05.stderr:1+0 records out 2026-03-24T11:37:03.473 INFO:tasks.workunit.client.0.vm05.stderr:1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0345075 s, 30.4 MB/s 2026-03-24T11:37:03.492 INFO:tasks.workunit.client.0.vm05.stderr: Importing image: 100% complete...done. 2026-03-24T11:37:03.496 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd export test1 - 2026-03-24T11:37:03.497 INFO:tasks.workunit.client.0.vm05.stderr:++ md5sum 2026-03-24T11:37:03.524 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 100% complete...done. 2026-03-24T11:37:03.531 INFO:tasks.workunit.client.0.vm05.stderr:+ md5sum='731d00dce7bad38d3e9c4fb7b69a77b6 -' 2026-03-24T11:37:03.531 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create test1@snap1 2026-03-24T11:37:03.592 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-24T11:37:03.598 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap protect test1@snap1 2026-03-24T11:37:03.629 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create test1@snap2 2026-03-24T11:37:04.385 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 
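The workunit's data-integrity check captures an md5 of the image contents up front (`rbd export test1 - | md5sum`) and compares it after every clone and migration step. The pattern can be reproduced locally, with a plain file copy standing in for the rbd import/export round trip since no cluster is assumed here:

```shell
# Integrity-check pattern: checksum the data before and after a round
# trip; a file copy stands in for `rbd import` / `rbd export`.
dd if=/dev/urandom of=/tmp/img.bin bs=1M count=1 2>/dev/null
md5sum=$(md5sum < /tmp/img.bin)      # baseline, printed as '<hash>  -'
cp /tmp/img.bin /tmp/img.out         # stand-in for the export path
test "$(md5sum < /tmp/img.out)" = "$md5sum" && echo "checksum unchanged"
```

The log's repeated `test '731d00dce7bad38d3e9c4fb7b69a77b6 -' = ...` lines are exactly this comparison, asserting that neither cloning nor migration altered the image payload.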
2026-03-24T11:37:04.392 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone test1@snap1 clone_v1 --rbd_default_clone_format=1 2026-03-24T11:37:04.437 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone test1@snap2 clone_v2 --rbd_default_clone_format=2 2026-03-24T11:37:04.476 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info clone_v1 2026-03-24T11:37:04.477 INFO:tasks.workunit.client.0.vm05.stderr:+ fgrep 'parent: rbd/test1@snap1' 2026-03-24T11:37:04.505 INFO:tasks.workunit.client.0.vm05.stdout: parent: rbd/test1@snap1 2026-03-24T11:37:04.505 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info clone_v2 2026-03-24T11:37:04.505 INFO:tasks.workunit.client.0.vm05.stderr:+ fgrep 'parent: rbd/test1@snap2' 2026-03-24T11:37:04.535 INFO:tasks.workunit.client.0.vm05.stdout: parent: rbd/test1@snap2 2026-03-24T11:37:04.535 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info clone_v2 2026-03-24T11:37:04.535 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'op_features: clone-child' 2026-03-24T11:37:04.566 INFO:tasks.workunit.client.0.vm05.stdout: op_features: clone-child 2026-03-24T11:37:04.567 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd export clone_v1 - 2026-03-24T11:37:04.567 INFO:tasks.workunit.client.0.vm05.stderr:++ md5sum 2026-03-24T11:37:04.600 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 100% complete...done. 2026-03-24T11:37:04.603 INFO:tasks.workunit.client.0.vm05.stderr:+ test '731d00dce7bad38d3e9c4fb7b69a77b6 -' = '731d00dce7bad38d3e9c4fb7b69a77b6 -' 2026-03-24T11:37:04.604 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd export clone_v2 - 2026-03-24T11:37:04.604 INFO:tasks.workunit.client.0.vm05.stderr:++ md5sum 2026-03-24T11:37:04.632 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 100% complete...done. 
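The parent checks above (`rbd info clone_v1 | fgrep 'parent: rbd/test1@snap1'`) act as assertions through the grep exit status: if the expected line is absent, the pipeline fails and, presumably under `set -e` or an equivalent check, the workunit aborts. A stand-alone illustration, where `$info` is a stand-in for real `rbd info` output:

```shell
# fgrep pipelines as assertions: nonzero exit status when the expected
# 'parent:' line is absent. $info stands in for `rbd info` output.
info="parent: rbd/test1@snap1"
if echo "$info" | fgrep -q 'parent: rbd/test1@snap1'; then
    echo "parent check passed"
else
    echo "parent check FAILED" >&2
    exit 1
fi
```

This is why the later re-checks after `rbd migration prepare test1 rbd2/test2` matter: the same pipelines then expect `parent: rbd2/test2@snap1`, verifying that the clones' parent links were relinked to the migration target.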
2026-03-24T11:37:04.635 INFO:tasks.workunit.client.0.vm05.stderr:+ test '731d00dce7bad38d3e9c4fb7b69a77b6 -' = '731d00dce7bad38d3e9c4fb7b69a77b6 -' 2026-03-24T11:37:04.635 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd children test1@snap1 2026-03-24T11:37:04.665 INFO:tasks.workunit.client.0.vm05.stderr:+ test rbd/clone_v1 = rbd/clone_v1 2026-03-24T11:37:04.665 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd children test1@snap2 2026-03-24T11:37:04.692 INFO:tasks.workunit.client.0.vm05.stderr:+ test rbd/clone_v2 = rbd/clone_v2 2026-03-24T11:37:04.692 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration prepare test1 rbd2/test2 2026-03-24T11:37:04.755 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:04.757+0000 7f2178ccb640 0 -- 192.168.123.105:0/2895906271 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x56283761be60 msgr2=0x5628376ac160 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-24T11:37:05.590 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:05.593+0000 7f2178ccb640 0 -- 192.168.123.105:0/2895906271 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x56283761be60 msgr2=0x7f215809d630 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:37:06.638 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info clone_v1 2026-03-24T11:37:06.638 INFO:tasks.workunit.client.0.vm05.stderr:+ fgrep 'parent: rbd2/test2@snap1' 2026-03-24T11:37:06.679 INFO:tasks.workunit.client.0.vm05.stdout: parent: rbd2/test2@snap1 2026-03-24T11:37:06.679 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info clone_v2 2026-03-24T11:37:06.679 INFO:tasks.workunit.client.0.vm05.stderr:+ fgrep 'parent: rbd2/test2@snap2' 2026-03-24T11:37:06.719 INFO:tasks.workunit.client.0.vm05.stdout: parent: rbd2/test2@snap2 2026-03-24T11:37:06.719 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info clone_v2 2026-03-24T11:37:06.720 
INFO:tasks.workunit.client.0.vm05.stderr:+ fgrep 'op_features: clone-child' 2026-03-24T11:37:06.762 INFO:tasks.workunit.client.0.vm05.stdout: op_features: clone-child 2026-03-24T11:37:06.763 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd children rbd2/test2@snap1 2026-03-24T11:37:06.803 INFO:tasks.workunit.client.0.vm05.stderr:+ test rbd/clone_v1 = rbd/clone_v1 2026-03-24T11:37:06.803 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd children rbd2/test2@snap2 2026-03-24T11:37:06.839 INFO:tasks.workunit.client.0.vm05.stderr:+ test rbd/clone_v2 = rbd/clone_v2 2026-03-24T11:37:06.839 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration execute test1 2026-03-24T11:37:06.900 INFO:tasks.workunit.client.0.vm05.stderr: Image migration: 100% complete...done. 2026-03-24T11:37:06.904 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd migration commit test1 2026-03-24T11:37:06.904 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration commit test1 2026-03-24T11:37:06.946 INFO:tasks.workunit.client.0.vm05.stderr:rbd: the image has descendants: 2026-03-24T11:37:06.946 INFO:tasks.workunit.client.0.vm05.stderr: rbd/clone_v1 2026-03-24T11:37:06.946 INFO:tasks.workunit.client.0.vm05.stderr: rbd/clone_v2 2026-03-24T11:37:06.946 INFO:tasks.workunit.client.0.vm05.stderr:Warning: in-use, read-only descendant images will not detect the parent update. 2026-03-24T11:37:06.946 INFO:tasks.workunit.client.0.vm05.stderr:Ensure no descendant images are opened read-only and run again with force flag. 
2026-03-24T11:37:06.949 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0 2026-03-24T11:37:06.949 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration commit test1 --force 2026-03-24T11:37:06.991 INFO:tasks.workunit.client.0.vm05.stderr:rbd: the image has descendants: 2026-03-24T11:37:06.991 INFO:tasks.workunit.client.0.vm05.stderr: rbd/clone_v1 2026-03-24T11:37:06.991 INFO:tasks.workunit.client.0.vm05.stderr: rbd/clone_v2 2026-03-24T11:37:06.991 INFO:tasks.workunit.client.0.vm05.stderr:Warning: in-use, read-only descendant images will not detect the parent update. 2026-03-24T11:37:06.991 INFO:tasks.workunit.client.0.vm05.stderr:Proceeding anyway due to force flag set. 2026-03-24T11:37:06.995 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:06.997+0000 7f3acffff640 0 -- 192.168.123.105:0/1142967383 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7f3aac00a8f0 msgr2=0x7f3aac02ad70 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:37:07.001 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:07.005+0000 7f3ad57d7640 0 -- 192.168.123.105:0/1142967383 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x564bd0665c40 msgr2=0x564bd06f5690 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:37:08.613 INFO:tasks.workunit.client.0.vm05.stderr: Commit image migration: 100% complete...done. 2026-03-24T11:37:08.618 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd export clone_v1 - 2026-03-24T11:37:08.618 INFO:tasks.workunit.client.0.vm05.stderr:++ md5sum 2026-03-24T11:37:08.650 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 100% complete...done. 
2026-03-24T11:37:08.654 INFO:tasks.workunit.client.0.vm05.stderr:+ test '731d00dce7bad38d3e9c4fb7b69a77b6 -' = '731d00dce7bad38d3e9c4fb7b69a77b6 -' 2026-03-24T11:37:08.654 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd export clone_v2 - 2026-03-24T11:37:08.654 INFO:tasks.workunit.client.0.vm05.stderr:++ md5sum 2026-03-24T11:37:08.685 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 100% complete...done. 2026-03-24T11:37:08.689 INFO:tasks.workunit.client.0.vm05.stderr:+ test '731d00dce7bad38d3e9c4fb7b69a77b6 -' = '731d00dce7bad38d3e9c4fb7b69a77b6 -' 2026-03-24T11:37:08.690 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration prepare rbd2/test2 test1 2026-03-24T11:37:08.749 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:08.753+0000 7f82e24b6640 0 -- 192.168.123.105:0/3607790930 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7f82c4008d30 msgr2=0x7f82c40291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:37:10.381 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:10.381+0000 7f82e24b6640 0 -- 192.168.123.105:0/3607790930 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x564182694080 msgr2=0x5641825daca0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:37:10.433 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info clone_v1 2026-03-24T11:37:10.433 INFO:tasks.workunit.client.0.vm05.stderr:+ fgrep 'parent: rbd/test1@snap1' 2026-03-24T11:37:10.477 INFO:tasks.workunit.client.0.vm05.stdout: parent: rbd/test1@snap1 2026-03-24T11:37:10.477 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info clone_v2 2026-03-24T11:37:10.477 INFO:tasks.workunit.client.0.vm05.stderr:+ fgrep 'parent: rbd/test1@snap2' 2026-03-24T11:37:10.519 INFO:tasks.workunit.client.0.vm05.stdout: parent: rbd/test1@snap2 2026-03-24T11:37:10.519 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info clone_v2 2026-03-24T11:37:10.519 
INFO:tasks.workunit.client.0.vm05.stderr:+ fgrep 'op_features: clone-child' 2026-03-24T11:37:10.559 INFO:tasks.workunit.client.0.vm05.stdout: op_features: clone-child 2026-03-24T11:37:10.559 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd children test1@snap1 2026-03-24T11:37:10.598 INFO:tasks.workunit.client.0.vm05.stderr:+ test rbd/clone_v1 = rbd/clone_v1 2026-03-24T11:37:10.598 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd children test1@snap2 2026-03-24T11:37:10.634 INFO:tasks.workunit.client.0.vm05.stderr:+ test rbd/clone_v2 = rbd/clone_v2 2026-03-24T11:37:10.634 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration execute test1 2026-03-24T11:37:10.698 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:10.701+0000 7f8f5f405640 0 -- 192.168.123.105:0/232625459 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x559822c52b60 msgr2=0x559822ce2ef0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:37:10.701 INFO:tasks.workunit.client.0.vm05.stderr: Image migration: 100% complete...done. 2026-03-24T11:37:10.705 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd migration commit test1 2026-03-24T11:37:10.705 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration commit test1 2026-03-24T11:37:10.758 INFO:tasks.workunit.client.0.vm05.stderr:rbd: the image has descendants: 2026-03-24T11:37:10.758 INFO:tasks.workunit.client.0.vm05.stderr: rbd/clone_v1 2026-03-24T11:37:10.758 INFO:tasks.workunit.client.0.vm05.stderr: rbd/clone_v2 2026-03-24T11:37:10.758 INFO:tasks.workunit.client.0.vm05.stderr:Warning: in-use, read-only descendant images will not detect the parent update. 2026-03-24T11:37:10.758 INFO:tasks.workunit.client.0.vm05.stderr:Ensure no descendant images are opened read-only and run again with force flag. 
2026-03-24T11:37:10.762 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0 2026-03-24T11:37:10.762 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration commit test1 --force 2026-03-24T11:37:10.821 INFO:tasks.workunit.client.0.vm05.stderr:rbd: the image has descendants: 2026-03-24T11:37:10.821 INFO:tasks.workunit.client.0.vm05.stderr: rbd/clone_v1 2026-03-24T11:37:10.821 INFO:tasks.workunit.client.0.vm05.stderr: rbd/clone_v2 2026-03-24T11:37:10.821 INFO:tasks.workunit.client.0.vm05.stderr:Warning: in-use, read-only descendant images will not detect the parent update. 2026-03-24T11:37:10.821 INFO:tasks.workunit.client.0.vm05.stderr:Proceeding anyway due to force flag set. 2026-03-24T11:37:10.822 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:10.825+0000 7fb66b9ee640 0 -- 192.168.123.105:0/3825383749 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x5613da281c40 msgr2=0x5613da311690 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:37:10.829 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:10.833+0000 7fb66b9ee640 0 -- 192.168.123.105:0/3825383749 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7fb64805c330 msgr2=0x7fb64807c730 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:37:12.408 INFO:tasks.workunit.client.0.vm05.stderr: Commit image migration: 100% complete...done. 2026-03-24T11:37:12.412 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd export clone_v1 - 2026-03-24T11:37:12.412 INFO:tasks.workunit.client.0.vm05.stderr:++ md5sum 2026-03-24T11:37:12.445 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 100% complete...done. 
2026-03-24T11:37:12.449 INFO:tasks.workunit.client.0.vm05.stderr:+ test '731d00dce7bad38d3e9c4fb7b69a77b6 -' = '731d00dce7bad38d3e9c4fb7b69a77b6 -' 2026-03-24T11:37:12.449 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd export clone_v2 - 2026-03-24T11:37:12.449 INFO:tasks.workunit.client.0.vm05.stderr:++ md5sum 2026-03-24T11:37:12.479 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 100% complete...done. 2026-03-24T11:37:12.484 INFO:tasks.workunit.client.0.vm05.stderr:+ test '731d00dce7bad38d3e9c4fb7b69a77b6 -' = '731d00dce7bad38d3e9c4fb7b69a77b6 -' 2026-03-24T11:37:12.484 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd remove clone_v1 2026-03-24T11:37:12.547 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T11:37:12.550 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd remove clone_v2 2026-03-24T11:37:12.611 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:12.613+0000 7f00c2cd1640 0 -- 192.168.123.105:0/1410492420 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7f009c008d70 msgr2=0x7f009c0291f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:37:12.617 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T11:37:12.621 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap unprotect test1@snap1 2026-03-24T11:37:12.652 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap purge test1 2026-03-24T11:37:14.394 INFO:tasks.workunit.client.0.vm05.stderr: Removing all snapshots: 50% complete... Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done. 
2026-03-24T11:37:14.401 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test1 2026-03-24T11:37:14.469 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:14.469+0000 7fe3b5544640 0 -- 192.168.123.105:0/934803832 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x5576fcb4e320 msgr2=0x5576fcb2c790 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:37:14.471 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T11:37:14.475 INFO:tasks.workunit.client.0.vm05.stderr:+ for format in 1 2 2026-03-24T11:37:14.476 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create -s 128M --image-format 1 test2 2026-03-24T11:37:14.490 INFO:tasks.workunit.client.0.vm05.stderr:rbd: image format 1 is deprecated 2026-03-24T11:37:14.496 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:14.497+0000 7efe13fea200 -1 librbd: Forced V1 image creation. 2026-03-24T11:37:14.502 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration prepare test2 --data-pool rbd2 2026-03-24T11:37:14.547 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2 2026-03-24T11:37:14.574 INFO:tasks.workunit.client.0.vm05.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential 2026-03-24T11:37:14.576 INFO:tasks.workunit.client.0.vm05.stdout: SEC OPS OPS/SEC BYTES/SEC 2026-03-24T11:37:14.576 INFO:tasks.workunit.client.0.vm05.stdout:elapsed: 0 ops: 1 ops/sec: inf bytes/sec: 0 B/s 2026-03-24T11:37:14.580 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration abort test2 2026-03-24T11:37:14.623 INFO:tasks.workunit.client.0.vm05.stderr: Abort image migration: 1% complete... Abort image migration: 3% complete... Abort image migration: 4% complete... Abort image migration: 6% complete... Abort image migration: 7% complete... Abort image migration: 9% complete... Abort image migration: 10% complete... 
Abort image migration: 12% complete... Abort image migration: 14% complete... [abort progress 15%–89% omitted] Abort image migration: 90% complete... 
2026-03-24T11:37:14.625+0000 7f7752a65640 0 -- 192.168.123.105:0/877178904 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7f772c008d30 msgr2=0x7f772c0291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:37:14.635 INFO:tasks.workunit.client.0.vm05.stderr: Abort image migration: 100% complete...done.
2026-03-24T11:37:14.638 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-24T11:37:14.664 INFO:tasks.workunit.client.0.vm05.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-24T11:37:14.666 INFO:tasks.workunit.client.0.vm05.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-24T11:37:14.666 INFO:tasks.workunit.client.0.vm05.stdout:elapsed: 0 ops: 1 ops/sec: 250.001 bytes/sec: 250 KiB/s
2026-03-24T11:37:14.670 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test2
2026-03-24T11:37:14.705 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:37:14.708 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create -s 128M --image-format 1 test2
2026-03-24T11:37:14.724 INFO:tasks.workunit.client.0.vm05.stderr:rbd: image format 1 is deprecated
2026-03-24T11:37:14.730 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:14.733+0000 7f555690a200 -1 librbd: Forced V1 image creation.
2026-03-24T11:37:14.737 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration prepare test2 --data-pool rbd2
2026-03-24T11:37:14.786 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-24T11:37:14.817 INFO:tasks.workunit.client.0.vm05.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-24T11:37:14.819 INFO:tasks.workunit.client.0.vm05.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-24T11:37:14.820 INFO:tasks.workunit.client.0.vm05.stdout:elapsed: 0 ops: 1 ops/sec: inf bytes/sec: 0 B/s
2026-03-24T11:37:14.824 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration execute test2
2026-03-24T11:37:14.881+0000 7fd8537fe640 0 -- 192.168.123.105:0/3525260722 >> [v2:192.168.123.105:6808/3270659984,v1:192.168.123.105:6809/3270659984] conn(0x7fd834004a40 msgr2=0x7fd834025440 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:37:14.889+0000 7fd853fff640 0 -- 192.168.123.105:0/3525260722 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7fd830008d30 msgr2=0x7fd8300291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:37:14.889 INFO:tasks.workunit.client.0.vm05.stderr: Image migration: 100% complete...done.
2026-03-24T11:37:14.894 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration abort test2
2026-03-24T11:37:14.941+0000 7efd4e33a640 0 -- 192.168.123.105:0/1033288416 >> [v2:192.168.123.105:6808/3270659984,v1:192.168.123.105:6809/3270659984] conn(0x7efd2c004930 msgr2=0x7efd2c004dd0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:37:14.952 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:14.953+0000 7efd4fdc4640 0 -- 192.168.123.105:0/1033288416 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7efd3005c4c0 msgr2=0x7efd3007c8e0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:37:14.954 INFO:tasks.workunit.client.0.vm05.stderr: Abort image migration: 100% complete...done.
2026-03-24T11:37:14.958 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-24T11:37:14.983 INFO:tasks.workunit.client.0.vm05.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-24T11:37:14.986 INFO:tasks.workunit.client.0.vm05.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-24T11:37:14.986 INFO:tasks.workunit.client.0.vm05.stdout:elapsed: 0 ops: 1 ops/sec: 250.001 bytes/sec: 250 KiB/s
2026-03-24T11:37:14.990 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test2
2026-03-24T11:37:15.032 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:37:15.036 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create -s 128M --image-format 1 test2
2026-03-24T11:37:15.051 INFO:tasks.workunit.client.0.vm05.stderr:rbd: image format 1 is deprecated
2026-03-24T11:37:15.057 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:15.061+0000 7f293839b200 -1 librbd: Forced V1 image creation.
2026-03-24T11:37:15.064 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration prepare test2 --data-pool INVALID_DATA_POOL
2026-03-24T11:37:15.095 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:15.097+0000 7f3af0568200 -1 librbd::image::CreateRequest: 0x5641e1434740 validate_data_pool: data pool INVALID_DATA_POOL does not exist
2026-03-24T11:37:15.095 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:15.097+0000 7f3af0568200 -1 librbd::Migration: create_dst_image: header creation failed: (2) No such file or directory
2026-03-24T11:37:15.101 INFO:tasks.workunit.client.0.vm05.stderr:rbd: preparing migration failed: (2) No such file or directory
2026-03-24T11:37:15.104 INFO:tasks.workunit.client.0.vm05.stderr:+ true
2026-03-24T11:37:15.104 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-24T11:37:15.129 INFO:tasks.workunit.client.0.vm05.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-24T11:37:15.132 INFO:tasks.workunit.client.0.vm05.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-24T11:37:15.132 INFO:tasks.workunit.client.0.vm05.stdout:elapsed: 0 ops: 1 ops/sec: inf bytes/sec: 0 B/s
2026-03-24T11:37:15.137 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test2
2026-03-24T11:37:15.176 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:37:15.179 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create -s 128M --image-format 1 test2
2026-03-24T11:37:15.195 INFO:tasks.workunit.client.0.vm05.stderr:rbd: image format 1 is deprecated
2026-03-24T11:37:15.201 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:15.205+0000 7fb56cd1d200 -1 librbd: Forced V1 image creation.
2026-03-24T11:37:15.208 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration prepare test2 rbd2/test2
2026-03-24T11:37:15.259 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 rbd2/test2
2026-03-24T11:37:15.288 INFO:tasks.workunit.client.0.vm05.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-24T11:37:15.291 INFO:tasks.workunit.client.0.vm05.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-24T11:37:15.291 INFO:tasks.workunit.client.0.vm05.stdout:elapsed: 0 ops: 1 ops/sec: 250.001 bytes/sec: 250 KiB/s
2026-03-24T11:37:15.296 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration abort test2
2026-03-24T11:37:15.353+0000 7fddda04d640 0 -- 192.168.123.105:0/3408759307 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x5602a5630b60 msgr2=0x5602a5651010 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:37:15.352 INFO:tasks.workunit.client.0.vm05.stderr: Abort image migration: 100% complete...done.
2026-03-24T11:37:15.355 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-24T11:37:15.380 INFO:tasks.workunit.client.0.vm05.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-24T11:37:15.382 INFO:tasks.workunit.client.0.vm05.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-24T11:37:15.382 INFO:tasks.workunit.client.0.vm05.stdout:elapsed: 0 ops: 1 ops/sec: 250.001 bytes/sec: 250 KiB/s
2026-03-24T11:37:15.386 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test2
2026-03-24T11:37:15.422 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:37:15.426 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create -s 128M --image-format 1 test2
2026-03-24T11:37:15.441 INFO:tasks.workunit.client.0.vm05.stderr:rbd: image format 1 is deprecated
2026-03-24T11:37:15.448 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:15.449+0000 7fc76d17c200 -1 librbd: Forced V1 image creation.
2026-03-24T11:37:15.455 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration prepare test2 rbd2/test2
2026-03-24T11:37:15.501 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration abort rbd2/test2
2026-03-24T11:37:15.549+0000 7f8fab34a640 0 -- 192.168.123.105:0/2906625387 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7f8f84008d30 msgr2=0x7f8f840291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:37:15.555 INFO:tasks.workunit.client.0.vm05.stderr: Abort image migration: 100% complete...done.
2026-03-24T11:37:15.558 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2
2026-03-24T11:37:15.583 INFO:tasks.workunit.client.0.vm05.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential
2026-03-24T11:37:15.586 INFO:tasks.workunit.client.0.vm05.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-24T11:37:15.586 INFO:tasks.workunit.client.0.vm05.stdout:elapsed: 0 ops: 1 ops/sec: 250.001 bytes/sec: 250 KiB/s
2026-03-24T11:37:15.589 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test2
2026-03-24T11:37:15.627 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:37:15.631 INFO:tasks.workunit.client.0.vm05.stderr:+ test 1 = 1 2026-03-24T11:37:15.631 INFO:tasks.workunit.client.0.vm05.stderr:+ continue 2026-03-24T11:37:15.631 INFO:tasks.workunit.client.0.vm05.stderr:+ for format in 1 2 2026-03-24T11:37:15.631 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create -s 128M --image-format 2 test2 2026-03-24T11:37:15.663 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration prepare test2 --data-pool rbd2 2026-03-24T11:37:15.725 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:15.729+0000 7f46f6aec640 0 -- 192.168.123.105:0/2529413579 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x562b5f8d2f70 msgr2=0x562b5f968720 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:37:15.731 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2 2026-03-24T11:37:15.765 INFO:tasks.workunit.client.0.vm05.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential 2026-03-24T11:37:15.768 INFO:tasks.workunit.client.0.vm05.stdout: SEC OPS OPS/SEC BYTES/SEC 2026-03-24T11:37:15.768 INFO:tasks.workunit.client.0.vm05.stdout:elapsed: 0 ops: 1 ops/sec: inf bytes/sec: 0 B/s 2026-03-24T11:37:15.776 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration abort test2 2026-03-24T11:37:15.845 INFO:tasks.workunit.client.0.vm05.stderr: Abort image migration: 1% complete... Abort image migration: 3% complete... Abort image migration: 4% complete... Abort image migration: 6% complete... Abort image migration: 7% complete... Abort image migration: 9% complete... Abort image migration: 10% complete... Abort image migration: 12% complete... Abort image migration: 14% complete... Abort image migration: 15% complete... Abort image migration: 17% complete... Abort image migration: 18% complete... Abort image migration: 20% complete... Abort image migration: 21% complete... 
Abort image migration: 23% complete... Abort image migration: 25% complete... Abort image migration: 26% complete... Abort image migration: 28% complete... Abort image migration: 29% complete... Abort image migration: 31% complete... Abort image migration: 32% complete... Abort image migration: 34% complete... Abort image migration: 35% complete... Abort image migration: 37% complete... Abort image migration: 39% complete... Abort image migration: 40% complete... Abort image migration: 42% complete... Abort image migration: 43% complete... Abort image migration: 45% complete... Abort image migration: 46% complete... Abort image migration: 48% complete... Abort image migration: 50% complete... Abort image migration: 51% complete... Abort image migration: 53% complete... Abort image migration: 54% complete... Abort image migration: 56% complete... Abort image migration: 57% complete... Abort image migration: 59% complete... Abort image migration: 60% complete... Abort image migration: 62% complete... Abort image migration: 64% complete... Abort image migration: 65% complete... Abort image migration: 67% complete... Abort image migration: 68% complete... Abort image migration: 70% complete... Abort image migration: 71% complete... Abort image migration: 73% complete... Abort image migration: 75% complete... Abort image migration: 76% complete... Abort image migration: 78% complete... Abort image migration: 79% complete... Abort image migration: 81% complete... Abort image migration: 82% complete... Abort image migration: 84% complete... Abort image migration: 85% complete... Abort image migration: 87% complete... Abort image migration: 89% complete... Abort image migration: 90% complete... Abort image migration: 92% complete... Abort image migration: 93% complete... Abort image migration: 95% complete... Abort image migration: 96% complete... 
Abort image migration: 98% complete...2026-03-24T11:37:15.845+0000 7fa95e6c1640 0 -- 192.168.123.105:0/791496925 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x561e95aeb000 msgr2=0x561e95b0b3e0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:37:15.848 INFO:tasks.workunit.client.0.vm05.stderr: Abort image migration: 100% complete...done. 2026-03-24T11:37:15.852 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2 2026-03-24T11:37:15.883 INFO:tasks.workunit.client.0.vm05.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential 2026-03-24T11:37:15.884 INFO:tasks.workunit.client.0.vm05.stdout: SEC OPS OPS/SEC BYTES/SEC 2026-03-24T11:37:15.884 INFO:tasks.workunit.client.0.vm05.stdout:elapsed: 0 ops: 1 ops/sec: inf bytes/sec: 0 B/s 2026-03-24T11:37:15.890 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test2 2026-03-24T11:37:15.953 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 3% complete... Removing image: 6% complete... Removing image: 9% complete... Removing image: 12% complete... Removing image: 15% complete... Removing image: 18% complete... Removing image: 21% complete... Removing image: 25% complete... Removing image: 28% complete... Removing image: 31% complete... Removing image: 34% complete... Removing image: 37% complete... Removing image: 40% complete... Removing image: 43% complete... Removing image: 46% complete... Removing image: 50% complete... Removing image: 53% complete... Removing image: 56% complete... Removing image: 59% complete... Removing image: 62% complete... Removing image: 65% complete... Removing image: 68% complete... Removing image: 71% complete... Removing image: 75% complete... Removing image: 78% complete... Removing image: 81% complete... Removing image: 84% complete... Removing image: 87% complete... Removing image: 90% complete... Removing image: 93% complete... 
Removing image: 96% complete...2026-03-24T11:37:15.953+0000 7f6890927640 0 -- 192.168.123.105:0/3258347283 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x560487e5e5b0 msgr2=0x560487e4e4b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-24T11:37:15.957 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T11:37:15.961 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create -s 128M --image-format 2 test2 2026-03-24T11:37:15.995 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration prepare test2 --data-pool rbd2 2026-03-24T11:37:16.060 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:16.061+0000 7f8f930c5640 0 -- 192.168.123.105:0/3295103575 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7f8f6c008d30 msgr2=0x7f8f6c0291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:37:16.066 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2 2026-03-24T11:37:16.101 INFO:tasks.workunit.client.0.vm05.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential 2026-03-24T11:37:16.103 INFO:tasks.workunit.client.0.vm05.stdout: SEC OPS OPS/SEC BYTES/SEC 2026-03-24T11:37:16.103 INFO:tasks.workunit.client.0.vm05.stdout:elapsed: 0 ops: 1 ops/sec: 250.001 bytes/sec: 250 KiB/s 2026-03-24T11:37:16.110 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration execute test2 2026-03-24T11:37:16.159 INFO:tasks.workunit.client.0.vm05.stderr: Image migration: 3% complete... Image migration: 6% complete... Image migration: 9% complete... Image migration: 12% complete... Image migration: 15% complete... Image migration: 18% complete... Image migration: 21% complete... Image migration: 25% complete... Image migration: 28% complete... Image migration: 31% complete... Image migration: 34% complete... Image migration: 37% complete... 
Image migration: 40% complete... Image migration: 43% complete... Image migration: 46% complete... Image migration: 50% complete... Image migration: 53% complete... Image migration: 56% complete... Image migration: 59% complete... Image migration: 62% complete... Image migration: 65% complete... Image migration: 68% complete... Image migration: 71% complete... Image migration: 75% complete... Image migration: 78% complete... Image migration: 81% complete... Image migration: 84% complete... Image migration: 87% complete... Image migration: 90% complete... Image migration: 93% complete... Image migration: 96% complete...2026-03-24T11:37:16.161+0000 7f8bbffff640 0 -- 192.168.123.105:0/1334306162 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7f8b9c008d30 msgr2=0x7f8b9c0291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:37:16.166 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:16.169+0000 7f8bbffff640 0 -- 192.168.123.105:0/1334306162 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7f8ba405c660 msgr2=0x7f8ba407ca80 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:37:16.169 INFO:tasks.workunit.client.0.vm05.stderr: Image migration: 100% complete...done. 2026-03-24T11:37:16.173 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration abort test2 2026-03-24T11:37:16.239 INFO:tasks.workunit.client.0.vm05.stderr: Abort image migration: 1% complete... Abort image migration: 3% complete... Abort image migration: 4% complete... Abort image migration: 6% complete... Abort image migration: 7% complete... Abort image migration: 9% complete... Abort image migration: 10% complete... Abort image migration: 12% complete... Abort image migration: 14% complete... Abort image migration: 15% complete... Abort image migration: 17% complete... Abort image migration: 18% complete... Abort image migration: 20% complete... 
Abort image migration: 21% complete... Abort image migration: 23% complete... Abort image migration: 25% complete... Abort image migration: 26% complete... Abort image migration: 28% complete... Abort image migration: 29% complete... Abort image migration: 31% complete... Abort image migration: 32% complete... Abort image migration: 34% complete... Abort image migration: 35% complete... Abort image migration: 37% complete... Abort image migration: 39% complete... Abort image migration: 40% complete... Abort image migration: 42% complete... Abort image migration: 43% complete... Abort image migration: 45% complete... Abort image migration: 46% complete... Abort image migration: 48% complete... Abort image migration: 50% complete... Abort image migration: 51% complete... Abort image migration: 53% complete... Abort image migration: 54% complete... Abort image migration: 56% complete... Abort image migration: 57% complete... Abort image migration: 59% complete... Abort image migration: 60% complete... Abort image migration: 62% complete... Abort image migration: 64% complete... Abort image migration: 65% complete... Abort image migration: 67% complete... Abort image migration: 68% complete... Abort image migration: 70% complete... Abort image migration: 71% complete... Abort image migration: 73% complete... Abort image migration: 75% complete... Abort image migration: 76% complete... Abort image migration: 78% complete... Abort image migration: 79% complete... Abort image migration: 81% complete... Abort image migration: 82% complete... Abort image migration: 84% complete... Abort image migration: 85% complete... Abort image migration: 87% complete... Abort image migration: 89% complete... Abort image migration: 90% complete... Abort image migration: 92% complete... Abort image migration: 93% complete... Abort image migration: 95% complete... Abort image migration: 96% complete... 
Abort image migration: 98% complete...2026-03-24T11:37:16.241+0000 7f50f6204640 0 -- 192.168.123.105:0/491708332 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7f50d405c600 msgr2=0x7f50d407ca20 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:37:16.244 INFO:tasks.workunit.client.0.vm05.stderr: Abort image migration: 100% complete...done. 2026-03-24T11:37:16.247 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2 2026-03-24T11:37:16.277 INFO:tasks.workunit.client.0.vm05.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential 2026-03-24T11:37:16.278 INFO:tasks.workunit.client.0.vm05.stdout: SEC OPS OPS/SEC BYTES/SEC 2026-03-24T11:37:16.278 INFO:tasks.workunit.client.0.vm05.stdout:elapsed: 0 ops: 1 ops/sec: 250.001 bytes/sec: 250 KiB/s 2026-03-24T11:37:16.285 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test2 2026-03-24T11:37:16.349 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 3% complete... Removing image: 6% complete... Removing image: 9% complete... Removing image: 12% complete... Removing image: 15% complete... Removing image: 18% complete... Removing image: 21% complete... Removing image: 25% complete... Removing image: 28% complete... Removing image: 31% complete... Removing image: 34% complete... Removing image: 37% complete... Removing image: 40% complete... Removing image: 43% complete... Removing image: 46% complete... Removing image: 50% complete... Removing image: 53% complete... Removing image: 56% complete... Removing image: 59% complete... Removing image: 62% complete... Removing image: 65% complete... Removing image: 68% complete... Removing image: 71% complete... Removing image: 75% complete... Removing image: 78% complete... Removing image: 81% complete... Removing image: 84% complete... Removing image: 87% complete... Removing image: 90% complete... 
Removing image: 93% complete... Removing image: 96% complete...2026-03-24T11:37:16.349+0000 7ff43ffae640 0 -- 192.168.123.105:0/221127129 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55768bef5360 msgr2=0x55768bf26f50 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:37:16.355 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T11:37:16.358 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create -s 128M --image-format 2 test2 2026-03-24T11:37:16.393 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration prepare test2 --data-pool INVALID_DATA_POOL 2026-03-24T11:37:16.440 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:16.441+0000 7fc5a0f9c200 -1 librbd::image::CreateRequest: 0x560fb6a06740 validate_data_pool: data pool INVALID_DATA_POOL does not exist 2026-03-24T11:37:16.440 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:16.441+0000 7fc5a0f9c200 -1 librbd::Migration: create_dst_image: header creation failed: (2) No such file or directory 2026-03-24T11:37:16.455 INFO:tasks.workunit.client.0.vm05.stderr:rbd: preparing migration failed: (2) No such file or directory 2026-03-24T11:37:16.459 INFO:tasks.workunit.client.0.vm05.stderr:+ true 2026-03-24T11:37:16.459 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2 2026-03-24T11:37:16.492 INFO:tasks.workunit.client.0.vm05.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential 2026-03-24T11:37:16.495 INFO:tasks.workunit.client.0.vm05.stdout: SEC OPS OPS/SEC BYTES/SEC 2026-03-24T11:37:16.495 INFO:tasks.workunit.client.0.vm05.stdout:elapsed: 0 ops: 1 ops/sec: 250.001 bytes/sec: 250 KiB/s 2026-03-24T11:37:16.502 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test2 2026-03-24T11:37:16.566 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 3% complete... Removing image: 6% complete... 
Removing image: 9% complete... [progress updates 12%-93% elided] Removing image: 96% complete...2026-03-24T11:37:16.569+0000 7fc74757b640 0 -- 192.168.123.105:0/1252688176 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55fb3e81e320 msgr2=0x55fb3e853390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:37:16.573 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:37:16.576 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create -s 128M --image-format 2 test2 2026-03-24T11:37:16.610 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration prepare test2 rbd2/test2 2026-03-24T11:37:16.673 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:16.673+0000 7fb016766640 0 -- 192.168.123.105:0/33476569 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x56539fe5ad20 msgr2=0x56539ff99f30 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:37:16.681 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 rbd2/test2 2026-03-24T11:37:16.715 INFO:tasks.workunit.client.0.vm05.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential 2026-03-24T11:37:16.718 INFO:tasks.workunit.client.0.vm05.stdout: SEC OPS OPS/SEC BYTES/SEC 2026-03-24T11:37:16.718 INFO:tasks.workunit.client.0.vm05.stdout:elapsed: 0 ops: 1 ops/sec: 250.001 bytes/sec: 250 KiB/s 2026-03-24T11:37:16.726 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration abort test2 2026-03-24T11:37:16.794 INFO:tasks.workunit.client.0.vm05.stderr: Abort image migration: 1% complete... [progress updates 3%-29% elided] Abort image migration: 31% complete...
Abort image migration: 32% complete... [progress updates 34%-98% elided] Abort image migration: 100% complete...done.
2026-03-24T11:37:16.798 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2 2026-03-24T11:37:16.829 INFO:tasks.workunit.client.0.vm05.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential 2026-03-24T11:37:16.830 INFO:tasks.workunit.client.0.vm05.stdout: SEC OPS OPS/SEC BYTES/SEC 2026-03-24T11:37:16.830 INFO:tasks.workunit.client.0.vm05.stdout:elapsed: 0 ops: 1 ops/sec: 250.001 bytes/sec: 250 KiB/s 2026-03-24T11:37:16.838 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test2 2026-03-24T11:37:16.905 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 3% complete... [progress updates 6%-93% elided] Removing image: 96% complete...2026-03-24T11:37:16.909+0000 7f8e97dc9640 0 -- 192.168.123.105:0/2620670173 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55dc2ccc4320 msgr2=0x55dc2ccf9390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:37:16.911 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:37:16.914 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create -s 128M --image-format 2 test2 2026-03-24T11:37:16.945 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration prepare test2 rbd2/test2 2026-03-24T11:37:17.008 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:17.009+0000 7f6a9ea62640 0 -- 192.168.123.105:0/2839989542 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x55c1127fdf50 msgr2=0x55c11293fcd0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:37:17.016 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration abort rbd2/test2 2026-03-24T11:37:17.080 INFO:tasks.workunit.client.0.vm05.stderr: Abort image migration: 1% complete... [progress updates 3%-50% elided] Abort image migration: 51% complete...
Abort image migration: 53% complete... [progress updates 54%-96% elided] Abort image migration: 98% complete...2026-03-24T11:37:17.081+0000 7fba72895640 0 -- 192.168.123.105:0/4098436902 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x556e46a02d80 msgr2=0x556e46a23160 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:37:17.080 INFO:tasks.workunit.client.0.vm05.stderr: Abort image migration: 100% complete...done.
2026-03-24T11:37:17.084 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2 2026-03-24T11:37:17.113 INFO:tasks.workunit.client.0.vm05.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential 2026-03-24T11:37:17.116 INFO:tasks.workunit.client.0.vm05.stdout: SEC OPS OPS/SEC BYTES/SEC 2026-03-24T11:37:17.116 INFO:tasks.workunit.client.0.vm05.stdout:elapsed: 0 ops: 1 ops/sec: 250.001 bytes/sec: 250 KiB/s 2026-03-24T11:37:17.122 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test2 2026-03-24T11:37:17.182 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 3% complete... [progress updates 6%-93% elided] Removing image: 96% complete...2026-03-24T11:37:17.185+0000 7f073c95a640 0 -- 192.168.123.105:0/4043493640 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55762f7f8320 msgr2=0x55762f82d390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-24T11:37:17.189 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:37:17.192 INFO:tasks.workunit.client.0.vm05.stderr:+ test 2 = 1 2026-03-24T11:37:17.192 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create -s 128M --image-format 2 test2 2026-03-24T11:37:17.225 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration prepare test2 rbd2/ns1/test3 2026-03-24T11:37:17.287 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:17.289+0000 7f9643c5f640 0 -- 192.168.123.105:0/3381485345 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x55a88206ce60 msgr2=0x55a88209fc10 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:37:17.295 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 rbd2/ns1/test3 2026-03-24T11:37:17.329 INFO:tasks.workunit.client.0.vm05.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential 2026-03-24T11:37:17.332 INFO:tasks.workunit.client.0.vm05.stdout: SEC OPS OPS/SEC BYTES/SEC 2026-03-24T11:37:17.333 INFO:tasks.workunit.client.0.vm05.stdout:elapsed: 0 ops: 1 ops/sec: inf bytes/sec: 0 B/s 2026-03-24T11:37:17.339 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd migration abort test2 2026-03-24T11:37:17.436 INFO:tasks.workunit.client.0.vm05.stderr: Abort image migration: 1% complete... [progress updates 3%-26% elided] Abort image migration: 28% complete...
Abort image migration: 29% complete... [progress updates 31%-98% elided] Abort image migration: 100% complete...done.
2026-03-24T11:37:17.440 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd bench --io-type write --io-size 1024 --io-total 1024 test2 2026-03-24T11:37:17.510 INFO:tasks.workunit.client.0.vm05.stdout:bench type write io_size 1024 io_threads 16 bytes 1024 pattern sequential 2026-03-24T11:37:17.587 INFO:tasks.workunit.client.0.vm05.stdout: SEC OPS OPS/SEC BYTES/SEC 2026-03-24T11:37:17.587 INFO:tasks.workunit.client.0.vm05.stdout:elapsed: 0 ops: 1 ops/sec: 13.1579 bytes/sec: 13 KiB/s 2026-03-24T11:37:17.597 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test2 2026-03-24T11:37:17.660 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 3% complete... [progress updates 6%-93% elided] Removing image: 96% complete...2026-03-24T11:37:17.661+0000 7faaad2e5640 0 -- 192.168.123.105:0/3499165522 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x5605e2141320 msgr2=0x5605e2176390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:37:17.666 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:37:17.670 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images 2026-03-24T11:37:17.670 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS [17 further '+ for img in $IMGS' xtrace lines, 11:37:17.755-11:37:19.533, elided] 2026-03-24T11:37:19.592 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd pool rm rbd2 rbd2 --yes-i-really-really-mean-it 2026-03-24T11:37:20.458 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd2' does not exist 2026-03-24T11:37:20.469 INFO:tasks.workunit.client.0.vm05.stderr:+ test_config 2026-03-24T11:37:20.469 INFO:tasks.workunit.client.0.vm05.stdout:testing config...
2026-03-24T11:37:20.469 INFO:tasks.workunit.client.0.vm05.stderr:+ echo 'testing config...' 2026-03-24T11:37:20.470 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images 2026-03-24T11:37:20.470 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS [17 further '+ for img in $IMGS' xtrace lines, 11:37:20.533-11:37:21.637, elided] 2026-03-24T11:37:21.690 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd config global set osd rbd_cache true 2026-03-24T11:37:21.690 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config global set osd rbd_cache true 2026-03-24T11:37:21.704 INFO:tasks.workunit.client.0.vm05.stderr:rbd: invalid config
entity: osd (must be global, client or client.) 2026-03-24T11:37:21.705 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0 2026-03-24T11:37:21.705 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd config global set global debug_ms 10 2026-03-24T11:37:21.705 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config global set global debug_ms 10 2026-03-24T11:37:21.719 INFO:tasks.workunit.client.0.vm05.stderr:rbd: not rbd option: debug_ms 2026-03-24T11:37:21.720 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0 2026-03-24T11:37:21.720 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd config global set global rbd_UNKNOWN false 2026-03-24T11:37:21.720 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config global set global rbd_UNKNOWN false 2026-03-24T11:37:21.733 INFO:tasks.workunit.client.0.vm05.stderr:rbd: invalid config key: rbd_UNKNOWN 2026-03-24T11:37:21.735 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0 2026-03-24T11:37:21.735 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd config global set global rbd_cache INVALID 2026-03-24T11:37:21.735 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config global set global rbd_cache INVALID 2026-03-24T11:37:21.753 INFO:tasks.workunit.client.0.vm05.stderr:rbd: error setting rbd_cache: error parsing value: Expected option value to be integer, got 'INVALID' 2026-03-24T11:37:21.755 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0 2026-03-24T11:37:21.755 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config global set global rbd_cache false 2026-03-24T11:37:21.782 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config global set client rbd_cache true 2026-03-24T11:37:21.806 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config global set client.123 rbd_cache false 2026-03-24T11:37:21.831 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config global get global rbd_cache 2026-03-24T11:37:21.831 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^false$' 2026-03-24T11:37:21.851 
INFO:tasks.workunit.client.0.vm05.stdout:false 2026-03-24T11:37:21.852 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config global get client rbd_cache 2026-03-24T11:37:21.852 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^true$' 2026-03-24T11:37:21.872 INFO:tasks.workunit.client.0.vm05.stdout:true 2026-03-24T11:37:21.872 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config global get client.123 rbd_cache 2026-03-24T11:37:21.872 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^false$' 2026-03-24T11:37:21.892 INFO:tasks.workunit.client.0.vm05.stdout:false 2026-03-24T11:37:21.892 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd config global get client.UNKNOWN rbd_cache 2026-03-24T11:37:21.892 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config global get client.UNKNOWN rbd_cache 2026-03-24T11:37:21.910 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd_cache is not set 2026-03-24T11:37:21.912 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0 2026-03-24T11:37:21.912 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config global list global 2026-03-24T11:37:21.912 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^rbd_cache * false * global *$' 2026-03-24T11:37:21.932 INFO:tasks.workunit.client.0.vm05.stdout:rbd_cache false global 2026-03-24T11:37:21.932 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config global list client 2026-03-24T11:37:21.932 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^rbd_cache * true * client *$' 2026-03-24T11:37:21.952 INFO:tasks.workunit.client.0.vm05.stdout:rbd_cache true client 2026-03-24T11:37:21.952 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config global list client.123 2026-03-24T11:37:21.952 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^rbd_cache * false * client.123 *$' 2026-03-24T11:37:21.973 INFO:tasks.workunit.client.0.vm05.stdout:rbd_cache false client.123 2026-03-24T11:37:21.973 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config global list client.UNKNOWN 2026-03-24T11:37:21.973 
INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^rbd_cache * true * client *$' 2026-03-24T11:37:21.993 INFO:tasks.workunit.client.0.vm05.stdout:rbd_cache true client 2026-03-24T11:37:21.993 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config global rm client rbd_cache 2026-03-24T11:37:22.015 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd config global get client rbd_cache 2026-03-24T11:37:22.015 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config global get client rbd_cache 2026-03-24T11:37:22.034 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd_cache is not set 2026-03-24T11:37:22.036 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0 2026-03-24T11:37:22.036 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config global list client 2026-03-24T11:37:22.036 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^rbd_cache * false * global *$' 2026-03-24T11:37:22.058 INFO:tasks.workunit.client.0.vm05.stdout:rbd_cache false global 2026-03-24T11:37:22.059 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config global rm client.123 rbd_cache 2026-03-24T11:37:22.083 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config global rm global rbd_cache 2026-03-24T11:37:22.112 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config pool set rbd rbd_cache true 2026-03-24T11:37:22.142 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config pool list rbd 2026-03-24T11:37:22.143 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^rbd_cache * true * pool *$' 2026-03-24T11:37:22.165 INFO:tasks.workunit.client.0.vm05.stdout:rbd_cache true pool 2026-03-24T11:37:22.166 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config pool get rbd rbd_cache 2026-03-24T11:37:22.166 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^true$' 2026-03-24T11:37:22.191 INFO:tasks.workunit.client.0.vm05.stdout:true 2026-03-24T11:37:22.191 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create -s 1 test1 2026-03-24T11:37:22.211 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:22.213+0000 7fa149369200 
-1 librbd: Forced V1 image creation. 2026-03-24T11:37:22.217 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config image list rbd/test1 2026-03-24T11:37:22.217 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^rbd_cache * true * pool *$' 2026-03-24T11:37:22.244 INFO:tasks.workunit.client.0.vm05.stdout:rbd_cache true pool 2026-03-24T11:37:22.244 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config image set rbd/test1 rbd_cache false 2026-03-24T11:37:22.274 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config image list rbd/test1 2026-03-24T11:37:22.274 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^rbd_cache * false * image *$' 2026-03-24T11:37:22.504 INFO:tasks.workunit.client.0.vm05.stdout:rbd_cache false image 2026-03-24T11:37:22.504 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config image get rbd/test1 rbd_cache 2026-03-24T11:37:22.504 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^false$' 2026-03-24T11:37:22.530 INFO:tasks.workunit.client.0.vm05.stdout:false 2026-03-24T11:37:22.530 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config image remove rbd/test1 rbd_cache 2026-03-24T11:37:22.560 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd config image get rbd/test1 rbd_cache 2026-03-24T11:37:22.560 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config image get rbd/test1 rbd_cache 2026-03-24T11:37:22.583 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd_cache is not set 2026-03-24T11:37:22.587 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0 2026-03-24T11:37:22.587 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config image list rbd/test1 2026-03-24T11:37:22.587 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^rbd_cache * true * pool *$' 2026-03-24T11:37:22.616 INFO:tasks.workunit.client.0.vm05.stdout:rbd_cache true pool 2026-03-24T11:37:22.616 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config pool remove rbd rbd_cache 2026-03-24T11:37:22.641 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd config pool get rbd rbd_cache 
2026-03-24T11:37:22.641 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config pool get rbd rbd_cache
2026-03-24T11:37:22.662 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd_cache is not set
2026-03-24T11:37:22.664 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:37:22.664 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd config pool list rbd
2026-03-24T11:37:22.664 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^rbd_cache * true * config *$'
2026-03-24T11:37:22.686 INFO:tasks.workunit.client.0.vm05.stdout:rbd_cache true config
2026-03-24T11:37:22.686 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test1
2026-03-24T11:37:22.713 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:37:22.716 INFO:tasks.workunit.client.0.vm05.stderr:+ RBD_CREATE_ARGS=
2026-03-24T11:37:22.716 INFO:tasks.workunit.client.0.vm05.stderr:+ test_others
2026-03-24T11:37:22.716 INFO:tasks.workunit.client.0.vm05.stderr:+ echo 'testing import, export, resize, and snapshots...'
2026-03-24T11:37:22.716 INFO:tasks.workunit.client.0.vm05.stdout:testing import, export, resize, and snapshots...
2026-03-24T11:37:22.717 INFO:tasks.workunit.client.0.vm05.stderr:+ TMP_FILES='/tmp/img1 /tmp/img1.new /tmp/img2 /tmp/img2.new /tmp/img3 /tmp/img3.new /tmp/img-diff1.new /tmp/img-diff2.new /tmp/img-diff3.new /tmp/img1.snap1 /tmp/img1.snap1 /tmp/img-diff1.snap1'
2026-03-24T11:37:22.717 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images
2026-03-24T11:37:22.717 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:22.776 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:22.833 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:22.889 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:22.946 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:23.003 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:23.065 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:23.120 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:23.177 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:23.234 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:23.299 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:23.356 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:23.413 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:23.470 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:23.525 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:23.581 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:23.637 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:23.694 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:23.748 INFO:tasks.workunit.client.0.vm05.stderr:+ rm -f /tmp/img1 /tmp/img1.new /tmp/img2 /tmp/img2.new /tmp/img3 /tmp/img3.new /tmp/img-diff1.new /tmp/img-diff2.new /tmp/img-diff3.new /tmp/img1.snap1 /tmp/img1.snap1 /tmp/img-diff1.snap1
2026-03-24T11:37:23.749 INFO:tasks.workunit.client.0.vm05.stderr:+ dd if=/bin/sh of=/tmp/img1 bs=1k count=1 seek=10
2026-03-24T11:37:23.749 INFO:tasks.workunit.client.0.vm05.stderr:1+0 records in
2026-03-24T11:37:23.750 INFO:tasks.workunit.client.0.vm05.stderr:1+0 records out
2026-03-24T11:37:23.750 INFO:tasks.workunit.client.0.vm05.stderr:1024 bytes (1.0 kB, 1.0 KiB) copied, 6.2718e-05 s, 16.3 MB/s
2026-03-24T11:37:23.750 INFO:tasks.workunit.client.0.vm05.stderr:+ dd if=/bin/dd of=/tmp/img1 bs=1k count=10 seek=100
2026-03-24T11:37:23.750 INFO:tasks.workunit.client.0.vm05.stderr:10+0 records in
2026-03-24T11:37:23.750 INFO:tasks.workunit.client.0.vm05.stderr:10+0 records out
2026-03-24T11:37:23.750 INFO:tasks.workunit.client.0.vm05.stderr:10240 bytes (10 kB, 10 KiB) copied, 7.7074e-05 s, 133 MB/s
2026-03-24T11:37:23.751 INFO:tasks.workunit.client.0.vm05.stderr:+ dd if=/bin/rm of=/tmp/img1 bs=1k count=100 seek=1000
2026-03-24T11:37:23.751 INFO:tasks.workunit.client.0.vm05.stderr:58+1 records in
2026-03-24T11:37:23.751 INFO:tasks.workunit.client.0.vm05.stderr:58+1 records out
2026-03-24T11:37:23.751 INFO:tasks.workunit.client.0.vm05.stderr:59912 bytes (60 kB, 59 KiB) copied, 0.000178433 s, 336 MB/s
2026-03-24T11:37:23.751 INFO:tasks.workunit.client.0.vm05.stderr:+ dd if=/bin/ls of=/tmp/img1 bs=1k seek=10000
2026-03-24T11:37:23.752 INFO:tasks.workunit.client.0.vm05.stderr:134+1 records in
2026-03-24T11:37:23.752 INFO:tasks.workunit.client.0.vm05.stderr:134+1 records out
2026-03-24T11:37:23.752 INFO:tasks.workunit.client.0.vm05.stderr:138216 bytes (138 kB, 135 KiB) copied, 0.000382354 s, 361 MB/s
2026-03-24T11:37:23.752 INFO:tasks.workunit.client.0.vm05.stderr:+ dd if=/bin/ln of=/tmp/img1 bs=1k seek=100000
2026-03-24T11:37:23.753 INFO:tasks.workunit.client.0.vm05.stderr:58+1 records in
2026-03-24T11:37:23.753 INFO:tasks.workunit.client.0.vm05.stderr:58+1 records out
2026-03-24T11:37:23.753 INFO:tasks.workunit.client.0.vm05.stderr:59912 bytes (60 kB, 59 KiB) copied, 0.000181589 s, 330 MB/s
2026-03-24T11:37:23.753 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd import /tmp/img1 testimg1
2026-03-24T11:37:23.772 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:23.773+0000 7f1171876200 -1 librbd: Forced V1 image creation.
2026-03-24T11:37:23.839 INFO:tasks.workunit.client.0.vm05.stderr: Importing image: 4% complete... Importing image: 8% complete... Importing image: 12% complete... Importing image: 16% complete... Importing image: 20% complete... Importing image: 24% complete... Importing image: 28% complete... Importing image: 32% complete... Importing image: 36% complete... Importing image: 40% complete... Importing image: 45% complete... Importing image: 49% complete... Importing image: 53% complete... Importing image: 57% complete... Importing image: 61% complete... Importing image: 65% complete... Importing image: 69% complete... Importing image: 73% complete... Importing image: 77% complete... Importing image: 81% complete... Importing image: 85% complete... Importing image: 90% complete... Importing image: 94% complete... Importing image: 98% complete... Importing image: 100% complete...done.
2026-03-24T11:37:23.843 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd resize testimg1 --size=256 --allow-shrink
2026-03-24T11:37:23.868 INFO:tasks.workunit.client.0.vm05.stderr: Resizing image: 100% complete...done.
2026-03-24T11:37:23.872 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd export testimg1 /tmp/img2
2026-03-24T11:37:23.937 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 1% complete... Exporting image: 3% complete... Exporting image: 4% complete... Exporting image: 6% complete... Exporting image: 7% complete... Exporting image: 9% complete... Exporting image: 10% complete... Exporting image: 12% complete... Exporting image: 14% complete... Exporting image: 15% complete... Exporting image: 17% complete... Exporting image: 18% complete... Exporting image: 20% complete... Exporting image: 21% complete... Exporting image: 23% complete... Exporting image: 25% complete... Exporting image: 26% complete... Exporting image: 28% complete... Exporting image: 29% complete... Exporting image: 31% complete... Exporting image: 32% complete... Exporting image: 34% complete... Exporting image: 35% complete... Exporting image: 37% complete... Exporting image: 39% complete... Exporting image: 40% complete... Exporting image: 42% complete... Exporting image: 43% complete... Exporting image: 45% complete... Exporting image: 46% complete... Exporting image: 48% complete... Exporting image: 50% complete... Exporting image: 51% complete... Exporting image: 53% complete... Exporting image: 54% complete... Exporting image: 56% complete... Exporting image: 57% complete... Exporting image: 59% complete... Exporting image: 60% complete... Exporting image: 62% complete... Exporting image: 64% complete... Exporting image: 65% complete... Exporting image: 67% complete... Exporting image: 68% complete... Exporting image: 70% complete... Exporting image: 71% complete... Exporting image: 73% complete... Exporting image: 75% complete... Exporting image: 76% complete... Exporting image: 78% complete... Exporting image: 79% complete... Exporting image: 81% complete... Exporting image: 82% complete... Exporting image: 84% complete... Exporting image: 85% complete... Exporting image: 87% complete... Exporting image: 89% complete... Exporting image: 90% complete... Exporting image: 92% complete... Exporting image: 93% complete... Exporting image: 95% complete... Exporting image: 96% complete... Exporting image: 98% complete... Exporting image: 100% complete...done.
2026-03-24T11:37:23.941 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create testimg1 --snap=snap1
2026-03-24T11:37:24.446 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T11:37:24.451 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd resize testimg1 --size=128
2026-03-24T11:37:24.474 INFO:tasks.workunit.client.0.vm05.stderr:rbd: shrinking an image is only allowed with the --allow-shrink flag
2026-03-24T11:37:24.477 INFO:tasks.workunit.client.0.vm05.stderr:+ true
2026-03-24T11:37:24.477 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd resize testimg1 --size=128 --allow-shrink
2026-03-24T11:37:24.513 INFO:tasks.workunit.client.0.vm05.stderr: Resizing image: 50% complete... Resizing image: 51% complete... Resizing image: 53% complete... Resizing image: 54% complete... Resizing image: 56% complete... Resizing image: 57% complete... Resizing image: 59% complete... Resizing image: 60% complete... Resizing image: 62% complete... Resizing image: 64% complete... Resizing image: 65% complete... Resizing image: 67% complete... Resizing image: 68% complete... Resizing image: 70% complete... Resizing image: 71% complete... Resizing image: 73% complete... Resizing image: 75% complete... Resizing image: 76% complete... Resizing image: 78% complete... Resizing image: 79% complete... Resizing image: 81% complete... Resizing image: 82% complete... Resizing image: 84% complete... Resizing image: 85% complete... Resizing image: 87% complete... Resizing image: 89% complete... Resizing image: 90% complete... Resizing image: 92% complete... Resizing image: 93% complete... Resizing image: 95% complete... Resizing image: 96% complete... Resizing image: 98% complete... Resizing image: 100% complete...done.
2026-03-24T11:37:24.518 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd export testimg1 /tmp/img3
2026-03-24T11:37:24.562 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 3% complete... Exporting image: 6% complete... Exporting image: 9% complete... Exporting image: 12% complete... Exporting image: 15% complete... Exporting image: 18% complete... Exporting image: 21% complete... Exporting image: 25% complete... Exporting image: 28% complete... Exporting image: 31% complete... Exporting image: 34% complete... Exporting image: 37% complete... Exporting image: 40% complete... Exporting image: 43% complete... Exporting image: 46% complete... Exporting image: 50% complete... Exporting image: 53% complete... Exporting image: 56% complete... Exporting image: 59% complete... Exporting image: 62% complete... Exporting image: 65% complete... Exporting image: 68% complete... Exporting image: 71% complete... Exporting image: 75% complete... Exporting image: 78% complete... Exporting image: 81% complete... Exporting image: 84% complete... Exporting image: 87% complete... Exporting image: 90% complete... Exporting image: 93% complete... Exporting image: 96% complete... Exporting image: 100% complete...done.
2026-03-24T11:37:24.568 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info testimg1
2026-03-24T11:37:24.568 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'size 128 MiB'
2026-03-24T11:37:24.592 INFO:tasks.workunit.client.0.vm05.stdout: size 128 MiB in 32 objects
2026-03-24T11:37:24.592 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info --snap=snap1 testimg1
2026-03-24T11:37:24.592 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'size 256 MiB'
2026-03-24T11:37:24.616 INFO:tasks.workunit.client.0.vm05.stdout: size 256 MiB in 64 objects
2026-03-24T11:37:24.616 INFO:tasks.workunit.client.0.vm05.stderr:+ rm -rf /tmp/diff-testimg1-1 /tmp/diff-testimg1-2
2026-03-24T11:37:24.617 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd export-diff testimg1 --snap=snap1 /tmp/diff-testimg1-1
2026-03-24T11:37:24.648 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 3% complete... Exporting image: 37% complete... Exporting image: 100% complete...done.
2026-03-24T11:37:24.652 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd export-diff testimg1 --from-snap=snap1 /tmp/diff-testimg1-2
2026-03-24T11:37:24.676 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 100% complete...done.
2026-03-24T11:37:24.679 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --size=1 testimg-diff1
2026-03-24T11:37:24.698 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:24.701+0000 7f880cf92200 -1 librbd: Forced V1 image creation.
2026-03-24T11:37:24.705 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd import-diff --sparse-size 8K /tmp/diff-testimg1-1 testimg-diff1
2026-03-24T11:37:25.453 INFO:tasks.workunit.client.0.vm05.stderr: Importing image diff: 22% complete... Importing image diff: 63% complete... Importing image diff: 99% complete... Importing image diff: 100% complete...done.
2026-03-24T11:37:25.458 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd import-diff --sparse-size 8K /tmp/diff-testimg1-2 testimg-diff1
2026-03-24T11:37:25.499 INFO:tasks.workunit.client.0.vm05.stderr: Importing image diff: 68% complete... Importing image diff: 96% complete... Importing image diff: 100% complete...done.
2026-03-24T11:37:25.504 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info testimg1
2026-03-24T11:37:25.504 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'size 128 MiB'
2026-03-24T11:37:25.735 INFO:tasks.workunit.client.0.vm05.stdout: size 128 MiB in 32 objects
2026-03-24T11:37:25.735 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info --snap=snap1 testimg1
2026-03-24T11:37:25.735 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'size 256 MiB'
2026-03-24T11:37:25.759 INFO:tasks.workunit.client.0.vm05.stdout: size 256 MiB in 64 objects
2026-03-24T11:37:25.759 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info testimg-diff1
2026-03-24T11:37:25.759 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'size 128 MiB'
2026-03-24T11:37:25.783 INFO:tasks.workunit.client.0.vm05.stdout: size 128 MiB in 32 objects
2026-03-24T11:37:25.783 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info --snap=snap1 testimg-diff1
2026-03-24T11:37:25.783 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'size 256 MiB'
2026-03-24T11:37:25.807 INFO:tasks.workunit.client.0.vm05.stdout: size 256 MiB in 64 objects
2026-03-24T11:37:25.807 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd copy testimg1 --snap=snap1 testimg2
2026-03-24T11:37:25.930 INFO:tasks.workunit.client.0.vm05.stderr: Image copy: 1% complete... Image copy: 3% complete... Image copy: 4% complete... Image copy: 6% complete... Image copy: 7% complete... Image copy: 9% complete... Image copy: 10% complete... Image copy: 12% complete... Image copy: 14% complete... Image copy: 15% complete... Image copy: 17% complete... Image copy: 18% complete... Image copy: 20% complete... Image copy: 21% complete... Image copy: 23% complete... Image copy: 25% complete... Image copy: 26% complete... Image copy: 28% complete... Image copy: 29% complete... Image copy: 31% complete... Image copy: 32% complete... Image copy: 34% complete... Image copy: 35% complete... Image copy: 37% complete... Image copy: 39% complete... Image copy: 40% complete... Image copy: 42% complete... Image copy: 43% complete... Image copy: 45% complete... Image copy: 46% complete... Image copy: 48% complete... Image copy: 50% complete... Image copy: 51% complete... Image copy: 53% complete... Image copy: 54% complete... Image copy: 56% complete... Image copy: 57% complete... Image copy: 59% complete... Image copy: 60% complete... Image copy: 62% complete... Image copy: 64% complete... Image copy: 65% complete... Image copy: 67% complete... Image copy: 68% complete... Image copy: 70% complete... Image copy: 71% complete... Image copy: 73% complete... Image copy: 75% complete... Image copy: 76% complete... Image copy: 78% complete... Image copy: 79% complete... Image copy: 81% complete... Image copy: 82% complete... Image copy: 84% complete... Image copy: 85% complete... Image copy: 87% complete... Image copy: 89% complete... Image copy: 90% complete... Image copy: 92% complete... Image copy: 93% complete... Image copy: 95% complete... Image copy: 96% complete... Image copy: 98% complete... Image copy: 100% complete... Image copy: 100% complete...done.
2026-03-24T11:37:25.935 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd copy testimg1 testimg3
2026-03-24T11:37:25.985 INFO:tasks.workunit.client.0.vm05.stderr: Image copy: 3% complete... Image copy: 6% complete... Image copy: 9% complete... Image copy: 12% complete... Image copy: 15% complete... Image copy: 18% complete... Image copy: 21% complete... Image copy: 25% complete... Image copy: 28% complete... Image copy: 31% complete... Image copy: 34% complete... Image copy: 37% complete... Image copy: 40% complete... Image copy: 43% complete... Image copy: 46% complete... Image copy: 50% complete... Image copy: 53% complete... Image copy: 56% complete... Image copy: 59% complete... Image copy: 62% complete... Image copy: 65% complete... Image copy: 68% complete... Image copy: 71% complete... Image copy: 75% complete... Image copy: 78% complete... Image copy: 81% complete... Image copy: 84% complete... Image copy: 87% complete... Image copy: 90% complete... Image copy: 93% complete... Image copy: 96% complete... Image copy: 100% complete... Image copy: 100% complete...done.
2026-03-24T11:37:25.995 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd copy testimg-diff1 --sparse-size 768K --snap=snap1 testimg-diff2
2026-03-24T11:37:26.060 INFO:tasks.workunit.client.0.vm05.stderr: Image copy: 1% complete... Image copy: 3% complete... Image copy: 4% complete... Image copy: 6% complete... Image copy: 7% complete... Image copy: 9% complete... Image copy: 10% complete... Image copy: 12% complete... Image copy: 14% complete... Image copy: 15% complete... Image copy: 17% complete... Image copy: 18% complete... Image copy: 20% complete... Image copy: 21% complete... Image copy: 23% complete... Image copy: 25% complete... Image copy: 26% complete... Image copy: 28% complete... Image copy: 29% complete... Image copy: 31% complete... Image copy: 32% complete... Image copy: 34% complete... Image copy: 35% complete... Image copy: 37% complete... Image copy: 39% complete... Image copy: 40% complete... Image copy: 42% complete... Image copy: 43% complete... Image copy: 45% complete... Image copy: 46% complete... Image copy: 48% complete... Image copy: 50% complete... Image copy: 51% complete... Image copy: 53% complete... Image copy: 54% complete... Image copy: 56% complete... Image copy: 57% complete... Image copy: 59% complete... Image copy: 60% complete... Image copy: 62% complete... Image copy: 64% complete... Image copy: 65% complete... Image copy: 67% complete... Image copy: 68% complete... Image copy: 70% complete... Image copy: 71% complete... Image copy: 73% complete... Image copy: 75% complete... Image copy: 76% complete... Image copy: 78% complete... Image copy: 79% complete... Image copy: 81% complete... Image copy: 82% complete... Image copy: 84% complete... Image copy: 85% complete... Image copy: 87% complete... Image copy: 89% complete... Image copy: 90% complete... Image copy: 92% complete... Image copy: 93% complete... Image copy: 95% complete... Image copy: 96% complete... Image copy: 98% complete... Image copy: 100% complete... Image copy: 100% complete...done.
2026-03-24T11:37:26.065 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd copy testimg-diff1 --sparse-size 768K testimg-diff3
2026-03-24T11:37:26.128 INFO:tasks.workunit.client.0.vm05.stderr: Image copy: 3% complete... Image copy: 6% complete... Image copy: 9% complete... Image copy: 12% complete... Image copy: 15% complete... Image copy: 18% complete... Image copy: 21% complete... Image copy: 25% complete... Image copy: 28% complete... Image copy: 31% complete... Image copy: 34% complete... Image copy: 37% complete... Image copy: 40% complete... Image copy: 43% complete... Image copy: 46% complete... Image copy: 50% complete... Image copy: 53% complete... Image copy: 56% complete... Image copy: 59% complete... Image copy: 62% complete... Image copy: 65% complete... Image copy: 68% complete... Image copy: 71% complete... Image copy: 75% complete... Image copy: 78% complete... Image copy: 81% complete... Image copy: 84% complete... Image copy: 87% complete... Image copy: 90% complete... Image copy: 93% complete... Image copy: 96% complete... Image copy: 100% complete... Image copy: 100% complete...done.
2026-03-24T11:37:26.133 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info testimg2
2026-03-24T11:37:26.133 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'size 256 MiB'
2026-03-24T11:37:26.158 INFO:tasks.workunit.client.0.vm05.stdout: size 256 MiB in 64 objects
2026-03-24T11:37:26.158 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info testimg3
2026-03-24T11:37:26.158 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'size 128 MiB'
2026-03-24T11:37:26.184 INFO:tasks.workunit.client.0.vm05.stdout: size 128 MiB in 32 objects
2026-03-24T11:37:26.184 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info testimg-diff2
2026-03-24T11:37:26.184 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'size 256 MiB'
2026-03-24T11:37:26.209 INFO:tasks.workunit.client.0.vm05.stdout: size 256 MiB in 64 objects
2026-03-24T11:37:26.210 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info testimg-diff3
2026-03-24T11:37:26.210 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'size 128 MiB'
2026-03-24T11:37:26.235 INFO:tasks.workunit.client.0.vm05.stdout: size 128 MiB in 32 objects
2026-03-24T11:37:26.236 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd deep copy testimg1 testimg4
2026-03-24T11:37:26.752 INFO:tasks.workunit.client.0.vm05.stderr: Image deep copy: 1% complete... Image deep copy: 3% complete... Image deep copy: 4% complete... Image deep copy: 6% complete... Image deep copy: 7% complete... Image deep copy: 9% complete... Image deep copy: 10% complete... Image deep copy: 12% complete... Image deep copy: 14% complete... Image deep copy: 15% complete... Image deep copy: 17% complete... Image deep copy: 18% complete... Image deep copy: 20% complete... Image deep copy: 21% complete... Image deep copy: 23% complete... Image deep copy: 25% complete... Image deep copy: 26% complete... Image deep copy: 28% complete... Image deep copy: 29% complete... Image deep copy: 31% complete... Image deep copy: 32% complete... Image deep copy: 34% complete... Image deep copy: 35% complete... Image deep copy: 37% complete... Image deep copy: 39% complete... Image deep copy: 40% complete... Image deep copy: 42% complete... Image deep copy: 43% complete... Image deep copy: 45% complete... Image deep copy: 46% complete... Image deep copy: 48% complete... Image deep copy: 50% complete... Image deep copy: 51% complete... Image deep copy: 53% complete... Image deep copy: 54% complete... Image deep copy: 56% complete... Image deep copy: 57% complete... Image deep copy: 59% complete... Image deep copy: 60% complete... Image deep copy: 62% complete... Image deep copy: 64% complete... Image deep copy: 65% complete... Image deep copy: 67% complete... Image deep copy: 68% complete... Image deep copy: 70% complete... Image deep copy: 71% complete... Image deep copy: 73% complete... Image deep copy: 75% complete... Image deep copy: 76% complete... Image deep copy: 78% complete... Image deep copy: 79% complete... Image deep copy: 81% complete... Image deep copy: 82% complete... Image deep copy: 84% complete... Image deep copy: 85% complete... Image deep copy: 87% complete... Image deep copy: 89% complete... Image deep copy: 90% complete... Image deep copy: 92% complete... Image deep copy: 93% complete... Image deep copy: 95% complete... Image deep copy: 96% complete... Image deep copy: 98% complete... Image deep copy: 100% complete... Image deep copy: 100% complete...done.
2026-03-24T11:37:26.755 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd deep copy testimg1 --snap=snap1 testimg5
2026-03-24T11:37:27.750 INFO:tasks.workunit.client.0.vm05.stderr: Image deep copy: 1% complete... Image deep copy: 3% complete... Image deep copy: 4% complete... Image deep copy: 6% complete... Image deep copy: 7% complete... Image deep copy: 9% complete... Image deep copy: 10% complete... Image deep copy: 12% complete... Image deep copy: 14% complete... Image deep copy: 15% complete... Image deep copy: 17% complete... Image deep copy: 18% complete... Image deep copy: 20% complete... Image deep copy: 21% complete... Image deep copy: 23% complete... Image deep copy: 25% complete... Image deep copy: 26% complete... Image deep copy: 28% complete... Image deep copy: 29% complete... Image deep copy: 31% complete... Image deep copy: 32% complete... Image deep copy: 34% complete... Image deep copy: 35% complete... Image deep copy: 37% complete... Image deep copy: 39% complete... Image deep copy: 40% complete... Image deep copy: 42% complete... Image deep copy: 43% complete... Image deep copy: 45% complete... Image deep copy: 46% complete... Image deep copy: 48% complete... Image deep copy: 50% complete... Image deep copy: 51% complete... Image deep copy: 53% complete... Image deep copy: 54% complete... Image deep copy: 56% complete... Image deep copy: 57% complete... Image deep copy: 59% complete... Image deep copy: 60% complete... Image deep copy: 62% complete... Image deep copy: 64% complete... Image deep copy: 65% complete... Image deep copy: 67% complete... Image deep copy: 68% complete... Image deep copy: 70% complete... Image deep copy: 71% complete... Image deep copy: 73% complete... Image deep copy: 75% complete... Image deep copy: 76% complete... Image deep copy: 78% complete... Image deep copy: 79% complete... Image deep copy: 81% complete... Image deep copy: 82% complete... Image deep copy: 84% complete... Image deep copy: 85% complete... Image deep copy: 87% complete... Image deep copy: 89% complete... Image deep copy: 90% complete... Image deep copy: 92% complete... Image deep copy: 93% complete... Image deep copy: 95% complete... Image deep copy: 96% complete... Image deep copy: 98% complete... Image deep copy: 100% complete... Image deep copy: 100% complete...done.
2026-03-24T11:37:27.754 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info testimg4
2026-03-24T11:37:27.754 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'size 128 MiB'
2026-03-24T11:37:27.788 INFO:tasks.workunit.client.0.vm05.stdout: size 128 MiB in 32 objects
2026-03-24T11:37:27.789 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info testimg5
2026-03-24T11:37:27.789 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'size 256 MiB'
2026-03-24T11:37:27.815 INFO:tasks.workunit.client.0.vm05.stdout: size 256 MiB in 64 objects
2026-03-24T11:37:27.815 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap ls testimg4
2026-03-24T11:37:27.815 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -v SNAPID
2026-03-24T11:37:27.815 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:37:27.816 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 1
2026-03-24T11:37:27.840 INFO:tasks.workunit.client.0.vm05.stdout:1
2026-03-24T11:37:27.840 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap ls testimg4
2026-03-24T11:37:27.840 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '.*snap1.*'
2026-03-24T11:37:27.864 INFO:tasks.workunit.client.0.vm05.stdout: 11 snap1 256 MiB Tue Mar 24 11:37:26 2026
2026-03-24T11:37:27.864 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd export testimg1 /tmp/img1.new
2026-03-24T11:37:27.906 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 3% complete... Exporting image: 6% complete... Exporting image: 9% complete... Exporting image: 12% complete... Exporting image: 15% complete... Exporting image: 18% complete... Exporting image: 21% complete... Exporting image: 25% complete... Exporting image: 28% complete... Exporting image: 31% complete... Exporting image: 34% complete... Exporting image: 37% complete... Exporting image: 40% complete... Exporting image: 43% complete... Exporting image: 46% complete... Exporting image: 50% complete... Exporting image: 53% complete... Exporting image: 56% complete... Exporting image: 59% complete... Exporting image: 62% complete... Exporting image: 65% complete... Exporting image: 68% complete... Exporting image: 71% complete... Exporting image: 75% complete... Exporting image: 78% complete... Exporting image: 81% complete... Exporting image: 84% complete... Exporting image: 87% complete... Exporting image: 90% complete... Exporting image: 93% complete... Exporting image: 96% complete... Exporting image: 100% complete...done.
2026-03-24T11:37:27.911 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd export testimg2 /tmp/img2.new
2026-03-24T11:37:27.967 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 1% complete... Exporting image: 3% complete... Exporting image: 4% complete... Exporting image: 6% complete... Exporting image: 7% complete... Exporting image: 9% complete... Exporting image: 10% complete... Exporting image: 12% complete... Exporting image: 14% complete... Exporting image: 15% complete... Exporting image: 17% complete... Exporting image: 18% complete... Exporting image: 20% complete... Exporting image: 21% complete... Exporting image: 23% complete... Exporting image: 25% complete... Exporting image: 26% complete... Exporting image: 28% complete... Exporting image: 29% complete... Exporting image: 31% complete... Exporting image: 32% complete... Exporting image: 34% complete... Exporting image: 35% complete... Exporting image: 37% complete... Exporting image: 39% complete... Exporting image: 40% complete... Exporting image: 42% complete... Exporting image: 43% complete... Exporting image: 45% complete... Exporting image: 46% complete... Exporting image: 48% complete... Exporting image: 50% complete... Exporting image: 51% complete... Exporting image: 53% complete... Exporting image: 54% complete... Exporting image: 56% complete... Exporting image: 57% complete... Exporting image: 59% complete... Exporting image: 60% complete... Exporting image: 62% complete... Exporting image: 64% complete... Exporting image: 65% complete... Exporting image: 67% complete... Exporting image: 68% complete... Exporting image: 70% complete... Exporting image: 71% complete... Exporting image: 73% complete... Exporting image: 75% complete... Exporting image: 76% complete... Exporting image: 78% complete... Exporting image: 79% complete... Exporting image: 81% complete... Exporting image: 82% complete... Exporting image: 84% complete... Exporting image: 85% complete... Exporting image: 87% complete... Exporting image: 89% complete... Exporting image: 90% complete... Exporting image: 92% complete... Exporting image: 93% complete... Exporting image: 95% complete... Exporting image: 96% complete... Exporting image: 98% complete... Exporting image: 100% complete...done.
2026-03-24T11:37:27.972 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd export testimg3 /tmp/img3.new
2026-03-24T11:37:28.017 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 3% complete... Exporting image: 6% complete... Exporting image: 9% complete... Exporting image: 12% complete... Exporting image: 15% complete... Exporting image: 18% complete... Exporting image: 21% complete... Exporting image: 25% complete... Exporting image: 28% complete... Exporting image: 31% complete... Exporting image: 34% complete... Exporting image: 37% complete... Exporting image: 40% complete... Exporting image: 43% complete... Exporting image: 46% complete... Exporting image: 50% complete... Exporting image: 53% complete... Exporting image: 56% complete... Exporting image: 59% complete... Exporting image: 62% complete... Exporting image: 65% complete... Exporting image: 68% complete... Exporting image: 71% complete... Exporting image: 75% complete... Exporting image: 78% complete... Exporting image: 81% complete... Exporting image: 84% complete... Exporting image: 87% complete... Exporting image: 90% complete... Exporting image: 93% complete... Exporting image: 96% complete... Exporting image: 100% complete...done.
2026-03-24T11:37:28.024 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd export testimg-diff1 /tmp/img-diff1.new 2026-03-24T11:37:28.073 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 3% complete... Exporting image: 6% complete... Exporting image: 9% complete... Exporting image: 12% complete... Exporting image: 15% complete... Exporting image: 18% complete... Exporting image: 21% complete... Exporting image: 25% complete... Exporting image: 28% complete... Exporting image: 31% complete... Exporting image: 34% complete... Exporting image: 37% complete... Exporting image: 40% complete... Exporting image: 43% complete... Exporting image: 46% complete... Exporting image: 50% complete... Exporting image: 53% complete... Exporting image: 56% complete... Exporting image: 59% complete... Exporting image: 62% complete... Exporting image: 65% complete... Exporting image: 68% complete... Exporting image: 71% complete... Exporting image: 75% complete... Exporting image: 78% complete... Exporting image: 81% complete... Exporting image: 84% complete... Exporting image: 87% complete... Exporting image: 90% complete... Exporting image: 93% complete... Exporting image: 96% complete... Exporting image: 100% complete...done. 2026-03-24T11:37:28.078 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd export testimg-diff2 /tmp/img-diff2.new 2026-03-24T11:37:28.147 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 1% complete... Exporting image: 3% complete... Exporting image: 4% complete... Exporting image: 6% complete... Exporting image: 7% complete... Exporting image: 9% complete... Exporting image: 10% complete... Exporting image: 12% complete... Exporting image: 14% complete... Exporting image: 15% complete... Exporting image: 17% complete... Exporting image: 18% complete... Exporting image: 20% complete... Exporting image: 21% complete... Exporting image: 23% complete... Exporting image: 25% complete... Exporting image: 26% complete... 
Exporting image: 28% complete... Exporting image: 29% complete... Exporting image: 31% complete... Exporting image: 32% complete... Exporting image: 34% complete... Exporting image: 35% complete... Exporting image: 37% complete... Exporting image: 39% complete... Exporting image: 40% complete... Exporting image: 42% complete... Exporting image: 43% complete... Exporting image: 45% complete... Exporting image: 46% complete... Exporting image: 48% complete... Exporting image: 50% complete... Exporting image: 51% complete... Exporting image: 53% complete... Exporting image: 54% complete... Exporting image: 56% complete... Exporting image: 57% complete... Exporting image: 59% complete... Exporting image: 60% complete... Exporting image: 62% complete... Exporting image: 64% complete... Exporting image: 65% complete... Exporting image: 67% complete... Exporting image: 68% complete... Exporting image: 70% complete... Exporting image: 71% complete... Exporting image: 73% complete... Exporting image: 75% complete... Exporting image: 76% complete... Exporting image: 78% complete... Exporting image: 79% complete... Exporting image: 81% complete... Exporting image: 82% complete... Exporting image: 84% complete... Exporting image: 85% complete... Exporting image: 87% complete... Exporting image: 89% complete... Exporting image: 90% complete... Exporting image: 92% complete... Exporting image: 93% complete... Exporting image: 95% complete... Exporting image: 96% complete... Exporting image: 98% complete... Exporting image: 100% complete...done. 2026-03-24T11:37:28.152 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd export testimg-diff3 /tmp/img-diff3.new 2026-03-24T11:37:28.202 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 3% complete... Exporting image: 6% complete... Exporting image: 9% complete... Exporting image: 12% complete... Exporting image: 15% complete... Exporting image: 18% complete... Exporting image: 21% complete... Exporting image: 25% complete... 
Exporting image: 28% complete... Exporting image: 31% complete... Exporting image: 34% complete... Exporting image: 37% complete... Exporting image: 40% complete... Exporting image: 43% complete... Exporting image: 46% complete... Exporting image: 50% complete... Exporting image: 53% complete... Exporting image: 56% complete... Exporting image: 59% complete... Exporting image: 62% complete... Exporting image: 65% complete... Exporting image: 68% complete... Exporting image: 71% complete... Exporting image: 75% complete... Exporting image: 78% complete... Exporting image: 81% complete... Exporting image: 84% complete... Exporting image: 87% complete... Exporting image: 90% complete... Exporting image: 93% complete... Exporting image: 96% complete... Exporting image: 100% complete...done. 2026-03-24T11:37:28.207 INFO:tasks.workunit.client.0.vm05.stderr:+ cmp /tmp/img2 /tmp/img2.new 2026-03-24T11:37:28.357 INFO:tasks.workunit.client.0.vm05.stderr:+ cmp /tmp/img3 /tmp/img3.new 2026-03-24T11:37:28.430 INFO:tasks.workunit.client.0.vm05.stderr:+ cmp /tmp/img2 /tmp/img-diff2.new 2026-03-24T11:37:28.547 INFO:tasks.workunit.client.0.vm05.stderr:+ cmp /tmp/img3 /tmp/img-diff3.new 2026-03-24T11:37:28.602 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap rollback --snap=snap1 testimg1 2026-03-24T11:37:28.648 INFO:tasks.workunit.client.0.vm05.stderr: Rolling back to snapshot: 1% complete... Rolling back to snapshot: 3% complete... Rolling back to snapshot: 4% complete... Rolling back to snapshot: 6% complete... Rolling back to snapshot: 7% complete... Rolling back to snapshot: 9% complete... Rolling back to snapshot: 10% complete... Rolling back to snapshot: 12% complete... Rolling back to snapshot: 14% complete... Rolling back to snapshot: 15% complete... Rolling back to snapshot: 17% complete... Rolling back to snapshot: 18% complete... Rolling back to snapshot: 20% complete... Rolling back to snapshot: 21% complete... Rolling back to snapshot: 23% complete... 
Rolling back to snapshot: 100% complete...done.
2026-03-24T11:37:28.653 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap rollback --snap=snap1 testimg-diff1
2026-03-24T11:37:28.701 INFO:tasks.workunit.client.0.vm05.stderr: Rolling back to snapshot: 100% complete...done.
2026-03-24T11:37:28.705 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info testimg1
2026-03-24T11:37:28.706 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'size 256 MiB'
2026-03-24T11:37:28.729 INFO:tasks.workunit.client.0.vm05.stdout: size 256 MiB in 64 objects
2026-03-24T11:37:28.730 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info testimg-diff1
2026-03-24T11:37:28.730 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'size 256 MiB'
2026-03-24T11:37:28.753 INFO:tasks.workunit.client.0.vm05.stdout: size 256 MiB in 64 objects
2026-03-24T11:37:28.753 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd export testimg1 /tmp/img1.snap1
2026-03-24T11:37:28.803 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 15% complete...
Exporting image: 100% complete...done.
2026-03-24T11:37:28.810 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd export testimg-diff1 /tmp/img-diff1.snap1
2026-03-24T11:37:28.869 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 100% complete...done.
2026-03-24T11:37:28.873 INFO:tasks.workunit.client.0.vm05.stderr:+ cmp /tmp/img2 /tmp/img1.snap1
2026-03-24T11:37:28.984 INFO:tasks.workunit.client.0.vm05.stderr:+ cmp /tmp/img2 /tmp/img-diff1.snap1
2026-03-24T11:37:29.103 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm testimg2
2026-03-24T11:37:29.171 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 75% complete...
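The export-and-compare sequence above (`rbd export` followed by `cmp` against the original file) is how the workunit verifies export integrity: any byte difference makes `cmp` exit non-zero and aborts the `bash -x` script. A minimal standalone sketch of just the comparison step, with the `rbd export` call omitted and illustrative file paths, so it can run without a cluster:

```shell
# Sketch of the workunit's export verification step. The rbd export itself
# is omitted here (assumption: the exported file already exists on disk),
# so only the byte-for-byte comparison logic is shown.
verify_export() {
    local orig=$1 exported=$2
    # cmp -s is silent and exits non-zero at the first differing byte,
    # which is what makes the workunit fail fast on a corrupted export
    if cmp -s "$orig" "$exported"; then
        echo "match"
    else
        echo "mismatch"
    fi
}
```

Under `set -e` (as in the workunit), a bare `cmp a b` achieves the same abort-on-mismatch behavior without the wrapper.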
Removing image: 78% complete...2026-03-24T11:37:29.173+0000 7ff9b1769640 0 -- 192.168.123.105:0/1733426249 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7ff99005bcf0 msgr2=0x7ff99007c0f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:37:29.177 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 98% complete...2026-03-24T11:37:29.181+0000 7ff9ab7fe640 0 -- 192.168.123.105:0/1733426249 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7ff98c00a4d0 msgr2=0x7ff98c02a8b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:37:29.184 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:37:29.188 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm testimg3
2026-03-24T11:37:29.255 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:37:29.259 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create testimg2 -s 0
2026-03-24T11:37:29.280 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:29.281+0000 7feb7818a200 -1 librbd: Forced V1 image creation.
2026-03-24T11:37:29.286 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd cp testimg2 testimg3
2026-03-24T11:37:29.322 INFO:tasks.workunit.client.0.vm05.stderr: Image copy: 100% complete...done.
2026-03-24T11:37:29.325 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd deep cp testimg2 testimg6
2026-03-24T11:37:29.559 INFO:tasks.workunit.client.0.vm05.stderr: Image deep copy: 100% complete...done.
2026-03-24T11:37:29.563 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap rm --snap=snap1 testimg1
2026-03-24T11:37:30.394 INFO:tasks.workunit.client.0.vm05.stderr: Removing snap: 100% complete...done.
2026-03-24T11:37:30.399 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap rm --snap=snap1 testimg-diff1
2026-03-24T11:37:31.397 INFO:tasks.workunit.client.0.vm05.stderr: Removing snap: 100% complete...done.
2026-03-24T11:37:31.402 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info --snap=snap1 testimg1
2026-03-24T11:37:31.402 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'error setting snapshot context: (2) No such file or directory'
2026-03-24T11:37:31.427 INFO:tasks.workunit.client.0.vm05.stdout:error setting snapshot context: (2) No such file or directory
2026-03-24T11:37:31.428 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info --snap=snap1 testimg-diff1
2026-03-24T11:37:31.428 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'error setting snapshot context: (2) No such file or directory'
2026-03-24T11:37:31.452 INFO:tasks.workunit.client.0.vm05.stdout:error setting snapshot context: (2) No such file or directory
2026-03-24T11:37:31.452 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd sparsify testimg1
2026-03-24T11:37:31.500 INFO:tasks.workunit.client.0.vm05.stderr: Image sparsify: 100% complete...done.
2026-03-24T11:37:31.505 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images
2026-03-24T11:37:31.505 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:31.584 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:31.645 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:31.727 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:32.472 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:33.492 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:33.578 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:33.675 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:33.784 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:33.882 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:33.942 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:34.005 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:34.075 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:34.139 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:34.201 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:34.263 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:34.325 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:34.388 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:34.696 INFO:tasks.workunit.client.0.vm05.stderr:+ rm -f /tmp/img1 /tmp/img1.new /tmp/img2 /tmp/img2.new /tmp/img3 /tmp/img3.new /tmp/img-diff1.new /tmp/img-diff2.new /tmp/img-diff3.new /tmp/img1.snap1 /tmp/img1.snap1 /tmp/img-diff1.snap1
2026-03-24T11:37:34.817 INFO:tasks.workunit.client.0.vm05.stdout:testing locking...
2026-03-24T11:37:34.817 INFO:tasks.workunit.client.0.vm05.stderr:+ test_locking
2026-03-24T11:37:34.817 INFO:tasks.workunit.client.0.vm05.stderr:+ echo 'testing locking...'
2026-03-24T11:37:34.817 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images
2026-03-24T11:37:34.817 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:34.875 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:34.933 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:34.995 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:35.056 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:35.117 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:35.178 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:35.240 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:35.301 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:35.368 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:35.427 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:35.488 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:35.748 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:35.807 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:35.864 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:35.925 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:36.193 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:36.255 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:36.319 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create -s 1 test1
2026-03-24T11:37:36.341 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:36.341+0000 7fd076786200 -1 librbd: Forced V1 image creation.
2026-03-24T11:37:36.348 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd lock list test1
2026-03-24T11:37:36.348 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:37:36.348 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^0$'
2026-03-24T11:37:36.374 INFO:tasks.workunit.client.0.vm05.stdout:0
2026-03-24T11:37:36.374 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd lock add test1 id
2026-03-24T11:37:36.405 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd lock list test1
2026-03-24T11:37:36.405 INFO:tasks.workunit.client.0.vm05.stderr:+ grep ' 1 '
2026-03-24T11:37:36.430 INFO:tasks.workunit.client.0.vm05.stdout:There is 1 exclusive lock on this image.
2026-03-24T11:37:36.430 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd lock list test1
2026-03-24T11:37:36.430 INFO:tasks.workunit.client.0.vm05.stderr:++ tail -n 1
2026-03-24T11:37:36.430 INFO:tasks.workunit.client.0.vm05.stderr:++ awk '{print $1;}'
2026-03-24T11:37:36.458 INFO:tasks.workunit.client.0.vm05.stderr:+ LOCKER=client.7716
2026-03-24T11:37:36.458 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd lock remove test1 id client.7716
2026-03-24T11:37:36.670 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd lock list test1
2026-03-24T11:37:36.670 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:37:36.670 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^0$'
2026-03-24T11:37:36.696 INFO:tasks.workunit.client.0.vm05.stdout:0
2026-03-24T11:37:36.696 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd lock add test1 id --shared tag
2026-03-24T11:37:36.728 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd lock list test1
2026-03-24T11:37:36.728 INFO:tasks.workunit.client.0.vm05.stderr:+ grep ' 1 '
2026-03-24T11:37:36.753 INFO:tasks.workunit.client.0.vm05.stdout:There is 1 shared lock on this image.
2026-03-24T11:37:36.753 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd lock add test1 id --shared tag
2026-03-24T11:37:36.785 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd lock list test1
2026-03-24T11:37:36.785 INFO:tasks.workunit.client.0.vm05.stderr:+ grep ' 2 '
2026-03-24T11:37:36.811 INFO:tasks.workunit.client.0.vm05.stdout:There are 2 shared locks on this image.
2026-03-24T11:37:36.811 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd lock add test1 id2 --shared tag
2026-03-24T11:37:36.851 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd lock list test1
2026-03-24T11:37:36.851 INFO:tasks.workunit.client.0.vm05.stderr:+ grep ' 3 '
2026-03-24T11:37:36.877 INFO:tasks.workunit.client.0.vm05.stdout:There are 3 shared locks on this image.
2026-03-24T11:37:36.877 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd lock list test1
2026-03-24T11:37:36.877 INFO:tasks.workunit.client.0.vm05.stderr:+ tail -n 1
2026-03-24T11:37:36.877 INFO:tasks.workunit.client.0.vm05.stderr:+ awk '{print $2, $1;}'
2026-03-24T11:37:36.877 INFO:tasks.workunit.client.0.vm05.stderr:+ xargs rbd lock remove test1
2026-03-24T11:37:37.673 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info test1
2026-03-24T11:37:37.673 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -qE 'features:.*exclusive'
2026-03-24T11:37:37.696 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test1
2026-03-24T11:37:37.725 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:37:37.728 INFO:tasks.workunit.client.0.vm05.stderr:+ test_thick_provision
2026-03-24T11:37:37.728 INFO:tasks.workunit.client.0.vm05.stdout:testing thick provision...
2026-03-24T11:37:37.728 INFO:tasks.workunit.client.0.vm05.stderr:+ echo 'testing thick provision...'
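The lock cleanup above pipes `rbd lock list` through `tail -n 1` and `awk '{print $2, $1;}'` to turn the last table row into the `<id> <locker>` argument pair that `xargs rbd lock remove test1` expects. A sketch of just that text-processing step, fed an illustrative sample row rather than real cluster output:

```shell
# Sketch of the workunit's lock-argument extraction. Assumes the
# `rbd lock list` table layout seen in the log: locker in column 1,
# lock id in column 2. The rbd commands themselves are omitted.
last_lock_args() {
    # take the last row and emit "<id> <locker>", the argument order
    # required by `rbd lock remove <image> <id> <locker>`
    tail -n 1 | awk '{print $2, $1;}'
}
```

In the workunit this output is piped to `xargs rbd lock remove test1`, which appends the pair to the remove command.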
2026-03-24T11:37:37.728 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images
2026-03-24T11:37:37.728 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:37.786 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:37.842 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:37.901 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:37.962 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:38.022 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:38.101 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:38.161 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:38.223 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:38.281 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:38.340 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:38.398 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:38.455 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:38.514 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:38.572 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:38.630 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:38.694 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:38.762 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:37:38.822 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --thick-provision -s 64M test1
2026-03-24T11:37:38.843 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:38.845+0000 7f580335c200 -1 librbd: Forced V1 image creation.
2026-03-24T11:37:39.119 INFO:tasks.workunit.client.0.vm05.stderr: Thick provisioning: 100% complete...done.
2026-03-24T11:37:39.126 INFO:tasks.workunit.client.0.vm05.stderr:+ count=0
2026-03-24T11:37:39.126 INFO:tasks.workunit.client.0.vm05.stderr:+ ret=
2026-03-24T11:37:39.126 INFO:tasks.workunit.client.0.vm05.stderr:+ '[' 0 -lt 10 ']'
2026-03-24T11:37:39.126 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd du
2026-03-24T11:37:39.126 INFO:tasks.workunit.client.0.vm05.stderr:+ grep test1
2026-03-24T11:37:39.126 INFO:tasks.workunit.client.0.vm05.stderr:+ tr -s ' '
2026-03-24T11:37:39.126 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^64 MiB'
2026-03-24T11:37:39.127 INFO:tasks.workunit.client.0.vm05.stderr:+ cut -d ' ' -f 4-5
2026-03-24T11:37:39.148 INFO:tasks.workunit.client.0.vm05.stderr:warning: fast-diff map is not enabled for test1. operation may be slow.
2026-03-24T11:37:39.154 INFO:tasks.workunit.client.0.vm05.stdout:64 MiB
2026-03-24T11:37:39.154 INFO:tasks.workunit.client.0.vm05.stderr:+ ret=0
2026-03-24T11:37:39.154 INFO:tasks.workunit.client.0.vm05.stderr:+ '[' 0 = 0 ']'
2026-03-24T11:37:39.154 INFO:tasks.workunit.client.0.vm05.stderr:+ break
2026-03-24T11:37:39.154 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd du
2026-03-24T11:37:39.176 INFO:tasks.workunit.client.0.vm05.stderr:warning: fast-diff map is not enabled for test1. operation may be slow.
2026-03-24T11:37:39.177 INFO:tasks.workunit.client.0.vm05.stdout:NAME PROVISIONED USED
2026-03-24T11:37:39.177 INFO:tasks.workunit.client.0.vm05.stdout:test1 64 MiB 64 MiB
2026-03-24T11:37:39.181 INFO:tasks.workunit.client.0.vm05.stderr:+ '[' 0 '!=' 0 ']'
2026-03-24T11:37:39.181 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test1
2026-03-24T11:37:39.245 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:37:39.249 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls
2026-03-24T11:37:39.249 INFO:tasks.workunit.client.0.vm05.stderr:+ grep test1
2026-03-24T11:37:39.249 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:37:39.249 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^0$'
2026-03-24T11:37:39.273 INFO:tasks.workunit.client.0.vm05.stdout:0
2026-03-24T11:37:39.273 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --thick-provision -s 4G test1
2026-03-24T11:37:39.294 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:39.297+0000 7f3dad972200 -1 librbd: Forced V1 image creation.
2026-03-24T11:37:40.947 INFO:tasks.workunit.client.0.vm05.stderr: Thick provisioning: 9% complete...
Thick provisioning: 10% complete...2026-03-24T11:37:40.949+0000 7f3dad6cf640 0 -- 192.168.123.105:0/564273858 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x564372c065d0 msgr2=0x564372bdf340 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:37:41.293 INFO:tasks.workunit.client.0.vm05.stderr: Thick provisioning: 11% complete... Thick provisioning: 12% complete...2026-03-24T11:37:41.293+0000 7f3dad6cf640 0 -- 192.168.123.105:0/564273858 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7f3d8805be60 msgr2=0x7f3d8807c240 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-24T11:37:57.906 INFO:tasks.workunit.client.0.vm05.stderr: Thick provisioning: 13% complete... Thick provisioning: 14% complete... Thick provisioning: 15% complete... Thick provisioning: 16% complete... Thick provisioning: 17% complete... Thick provisioning: 18% complete... Thick provisioning: 19% complete... Thick provisioning: 20% complete... Thick provisioning: 21% complete... Thick provisioning: 22% complete... Thick provisioning: 23% complete... Thick provisioning: 24% complete... Thick provisioning: 25% complete... Thick provisioning: 26% complete... Thick provisioning: 27% complete... Thick provisioning: 28% complete... Thick provisioning: 29% complete... Thick provisioning: 30% complete... Thick provisioning: 31% complete... Thick provisioning: 32% complete... Thick provisioning: 33% complete... Thick provisioning: 34% complete... Thick provisioning: 35% complete... Thick provisioning: 36% complete... Thick provisioning: 37% complete... Thick provisioning: 38% complete... Thick provisioning: 39% complete... Thick provisioning: 40% complete... Thick provisioning: 41% complete... Thick provisioning: 42% complete... Thick provisioning: 43% complete... Thick provisioning: 44% complete... Thick provisioning: 45% complete... 
Thick provisioning: 46% complete... Thick provisioning: 47% complete... Thick provisioning: 48% complete... Thick provisioning: 49% complete... Thick provisioning: 50% complete... Thick provisioning: 51% complete... Thick provisioning: 52% complete... Thick provisioning: 53% complete... Thick provisioning: 54% complete... Thick provisioning: 55% complete... Thick provisioning: 56% complete... Thick provisioning: 57% complete... Thick provisioning: 58% complete... Thick provisioning: 59% complete... Thick provisioning: 60% complete... Thick provisioning: 61% complete... Thick provisioning: 62% complete... Thick provisioning: 63% complete... Thick provisioning: 64% complete... Thick provisioning: 65% complete... Thick provisioning: 66% complete... Thick provisioning: 67% complete... Thick provisioning: 68% complete... Thick provisioning: 69% complete... Thick provisioning: 70% complete... Thick provisioning: 71% complete... Thick provisioning: 72% complete... Thick provisioning: 73% complete... Thick provisioning: 74% complete... Thick provisioning: 75% complete... Thick provisioning: 76% complete... Thick provisioning: 77% complete... Thick provisioning: 78% complete... Thick provisioning: 79% complete... Thick provisioning: 80% complete... Thick provisioning: 81% complete... Thick provisioning: 82% complete... Thick provisioning: 83% complete... Thick provisioning: 84% complete... Thick provisioning: 85% complete... Thick provisioning: 86% complete... Thick provisioning: 87% complete... Thick provisioning: 88% complete... Thick provisioning: 89% complete... Thick provisioning: 90% complete... Thick provisioning: 91% complete... Thick provisioning: 92% complete... Thick provisioning: 93% complete... Thick provisioning: 94% complete... Thick provisioning: 95% complete... Thick provisioning: 96% complete... Thick provisioning: 97% complete... Thick provisioning: 98% complete... Thick provisioning: 99% complete... Thick provisioning: 100% complete... 
Thick provisioning: 100% complete...done. 2026-03-24T11:37:57.912 INFO:tasks.workunit.client.0.vm05.stderr:+ count=0 2026-03-24T11:37:57.912 INFO:tasks.workunit.client.0.vm05.stderr:+ ret= 2026-03-24T11:37:57.912 INFO:tasks.workunit.client.0.vm05.stderr:+ '[' 0 -lt 10 ']' 2026-03-24T11:37:57.912 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd du 2026-03-24T11:37:57.913 INFO:tasks.workunit.client.0.vm05.stderr:+ grep test1 2026-03-24T11:37:57.913 INFO:tasks.workunit.client.0.vm05.stderr:+ tr -s ' ' 2026-03-24T11:37:57.913 INFO:tasks.workunit.client.0.vm05.stderr:+ cut -d ' ' -f 4-5 2026-03-24T11:37:57.917 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^4 GiB' 2026-03-24T11:37:57.936 INFO:tasks.workunit.client.0.vm05.stderr:warning: fast-diff map is not enabled for test1. operation may be slow. 2026-03-24T11:37:57.942 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:57.945+0000 7f1c9ae20640 0 -- 192.168.123.105:0/562700728 >> [v2:192.168.123.105:6808/3270659984,v1:192.168.123.105:6809/3270659984] conn(0x55fdbcad4b90 msgr2=0x55fdbcaf5040 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:37:57.944 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:57.945+0000 7f1c9c8aa640 0 -- 192.168.123.105:0/562700728 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55fdbca18fe0 msgr2=0x55fdbca59af0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-24T11:37:58.004 INFO:tasks.workunit.client.0.vm05.stdout:4 GiB 2026-03-24T11:37:58.004 INFO:tasks.workunit.client.0.vm05.stderr:+ ret=0 2026-03-24T11:37:58.004 INFO:tasks.workunit.client.0.vm05.stderr:+ '[' 0 = 0 ']' 2026-03-24T11:37:58.005 INFO:tasks.workunit.client.0.vm05.stderr:+ break 2026-03-24T11:37:58.005 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd du 2026-03-24T11:37:58.027 INFO:tasks.workunit.client.0.vm05.stderr:warning: fast-diff map is not enabled for test1. operation may be slow. 
2026-03-24T11:37:58.032 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:58.033+0000 7f3c44cee640 0 -- 192.168.123.105:0/3188125315 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7f3c2405bc40 msgr2=0x7f3c2407c040 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-24T11:37:58.034 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:37:58.037+0000 7f3c44cee640 0 -- 192.168.123.105:0/3188125315 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x559a0b170c00 msgr2=0x7f3c2407c7a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-24T11:37:58.095 INFO:tasks.workunit.client.0.vm05.stdout:NAME PROVISIONED USED 2026-03-24T11:37:58.095 INFO:tasks.workunit.client.0.vm05.stdout:test1 4 GiB 4 GiB 2026-03-24T11:37:58.098 INFO:tasks.workunit.client.0.vm05.stderr:+ '[' 0 '!=' 0 ']' 2026-03-24T11:37:58.098 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test1 2026-03-24T11:37:58.344 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 1% complete... Removing image: 2% complete... Removing image: 3% complete... Removing image: 4% complete... Removing image: 5% complete... Removing image: 6% complete... Removing image: 7% complete... Removing image: 8% complete... Removing image: 9% complete... Removing image: 10% complete...2026-03-24T11:37:58.345+0000 7fc1b72ce640 0 -- 192.168.123.105:0/1731117344 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55a8f61cd320 msgr2=0x55a8f62028a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-24T11:37:58.376 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 11% complete... 
Removing image: 12% complete...2026-03-24T11:37:58.377+0000 7fc1b6045640 0 -- 192.168.123.105:0/1731117344 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7fc18c009430 msgr2=0x7fc18c029810 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:38:00.188 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 13% complete... Removing image: 14% complete... Removing image: 15% complete... Removing image: 16% complete... Removing image: 17% complete... Removing image: 18% complete... Removing image: 19% complete... Removing image: 20% complete... Removing image: 21% complete... Removing image: 22% complete... Removing image: 23% complete... Removing image: 24% complete... Removing image: 25% complete... Removing image: 26% complete... Removing image: 27% complete... Removing image: 28% complete... Removing image: 29% complete... Removing image: 30% complete... Removing image: 31% complete... Removing image: 32% complete... Removing image: 33% complete... Removing image: 34% complete... Removing image: 35% complete... Removing image: 36% complete... Removing image: 37% complete... Removing image: 38% complete... Removing image: 39% complete... Removing image: 40% complete... Removing image: 41% complete... Removing image: 42% complete... Removing image: 43% complete... Removing image: 44% complete... Removing image: 45% complete... Removing image: 46% complete... Removing image: 47% complete... Removing image: 48% complete... Removing image: 49% complete... Removing image: 50% complete... Removing image: 51% complete... Removing image: 52% complete... Removing image: 53% complete... Removing image: 54% complete... Removing image: 55% complete... Removing image: 56% complete... Removing image: 57% complete... Removing image: 58% complete... Removing image: 59% complete... Removing image: 60% complete... Removing image: 61% complete... Removing image: 62% complete... Removing image: 63% complete... 
Removing image: 64% complete... Removing image: 65% complete... Removing image: 66% complete... Removing image: 67% complete... Removing image: 68% complete... Removing image: 69% complete... Removing image: 70% complete... Removing image: 71% complete... Removing image: 72% complete... Removing image: 73% complete... Removing image: 74% complete... Removing image: 75% complete... Removing image: 76% complete... Removing image: 77% complete... Removing image: 78% complete... Removing image: 79% complete... Removing image: 80% complete... Removing image: 81% complete... Removing image: 82% complete... Removing image: 83% complete... Removing image: 84% complete... Removing image: 85% complete... Removing image: 86% complete... Removing image: 87% complete... Removing image: 88% complete... Removing image: 89% complete... Removing image: 90% complete... Removing image: 91% complete... Removing image: 92% complete... Removing image: 93% complete... Removing image: 94% complete... Removing image: 95% complete... Removing image: 96% complete... Removing image: 97% complete... Removing image: 98% complete... Removing image: 99% complete... Removing image: 100% complete...done. 2026-03-24T11:38:00.191 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls 2026-03-24T11:38:00.192 INFO:tasks.workunit.client.0.vm05.stderr:+ grep test1 2026-03-24T11:38:00.192 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:38:00.192 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^0$' 2026-03-24T11:38:00.213 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-24T11:38:00.214 INFO:tasks.workunit.client.0.vm05.stderr:+ RBD_CREATE_ARGS='--image-format 2' 2026-03-24T11:38:00.214 INFO:tasks.workunit.client.0.vm05.stderr:+ test_others 2026-03-24T11:38:00.214 INFO:tasks.workunit.client.0.vm05.stdout:testing import, export, resize, and snapshots... 2026-03-24T11:38:00.214 INFO:tasks.workunit.client.0.vm05.stderr:+ echo 'testing import, export, resize, and snapshots...' 
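The thick-provision verification traced above is a bounded poll: parse the image's USED column out of `rbd du` and retry up to 10 times until it equals the provisioned size. A sketch of the same loop with the `rbd du` call replaced by a canned table matching the log output (the probe function is a stand-in, not the real command):

```shell
# Stand-in for `rbd du`; header and row mirror the log's stdout.
fake_rbd_du() {
  printf 'NAME  PROVISIONED  USED\ntest1  4 GiB  4 GiB\n'
}

# USED column: squeeze repeated spaces, then take fields 4-5 of the
# test1 row, as the workunit's pipeline does.
used() {
  fake_rbd_du | grep test1 | tr -s ' ' | cut -d ' ' -f 4-5
}

count=0
ret=1
while [ "$count" -lt 10 ]; do
  if used | grep -q '^4 GiB'; then
    ret=0
    break            # image fully provisioned; stop polling
  fi
  count=$((count + 1))
  sleep 1            # rbd du can lag behind the actual allocation
done
echo "ret=$ret"      # -> ret=0
```

The canned probe succeeds on the first pass, so the loop exits immediately; against a live cluster the retries absorb the lag between provisioning and `rbd du` reporting it.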
2026-03-24T11:38:00.214 INFO:tasks.workunit.client.0.vm05.stderr:+ TMP_FILES='/tmp/img1 /tmp/img1.new /tmp/img2 /tmp/img2.new /tmp/img3 /tmp/img3.new /tmp/img-diff1.new /tmp/img-diff2.new /tmp/img-diff3.new /tmp/img1.snap1 /tmp/img1.snap1 /tmp/img-diff1.snap1' 2026-03-24T11:38:00.214 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images 2026-03-24T11:38:00.214 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:38:00.279 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:38:00.337 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:38:00.394 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:38:00.451 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:38:00.509 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:38:00.570 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:38:00.833 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:38:00.896 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:38:00.960 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:38:01.023 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:38:01.083 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:38:01.145 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:38:01.206 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:38:01.268 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:38:01.326 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:38:01.383 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:38:01.443 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:38:01.508 INFO:tasks.workunit.client.0.vm05.stderr:+ rm -f /tmp/img1 /tmp/img1.new /tmp/img2 /tmp/img2.new /tmp/img3 /tmp/img3.new 
/tmp/img-diff1.new /tmp/img-diff2.new /tmp/img-diff3.new /tmp/img1.snap1 /tmp/img1.snap1 /tmp/img-diff1.snap1 2026-03-24T11:38:01.509 INFO:tasks.workunit.client.0.vm05.stderr:+ dd if=/bin/sh of=/tmp/img1 bs=1k count=1 seek=10 2026-03-24T11:38:01.510 INFO:tasks.workunit.client.0.vm05.stderr:1+0 records in 2026-03-24T11:38:01.510 INFO:tasks.workunit.client.0.vm05.stderr:1+0 records out 2026-03-24T11:38:01.510 INFO:tasks.workunit.client.0.vm05.stderr:1024 bytes (1.0 kB, 1.0 KiB) copied, 7.3228e-05 s, 14.0 MB/s 2026-03-24T11:38:01.510 INFO:tasks.workunit.client.0.vm05.stderr:+ dd if=/bin/dd of=/tmp/img1 bs=1k count=10 seek=100 2026-03-24T11:38:01.511 INFO:tasks.workunit.client.0.vm05.stderr:10+0 records in 2026-03-24T11:38:01.511 INFO:tasks.workunit.client.0.vm05.stderr:10+0 records out 2026-03-24T11:38:01.511 INFO:tasks.workunit.client.0.vm05.stderr:10240 bytes (10 kB, 10 KiB) copied, 7.7615e-05 s, 132 MB/s 2026-03-24T11:38:01.511 INFO:tasks.workunit.client.0.vm05.stderr:+ dd if=/bin/rm of=/tmp/img1 bs=1k count=100 seek=1000 2026-03-24T11:38:01.512 INFO:tasks.workunit.client.0.vm05.stderr:58+1 records in 2026-03-24T11:38:01.512 INFO:tasks.workunit.client.0.vm05.stderr:58+1 records out 2026-03-24T11:38:01.512 INFO:tasks.workunit.client.0.vm05.stderr:59912 bytes (60 kB, 59 KiB) copied, 0.000191368 s, 313 MB/s 2026-03-24T11:38:01.512 INFO:tasks.workunit.client.0.vm05.stderr:+ dd if=/bin/ls of=/tmp/img1 bs=1k seek=10000 2026-03-24T11:38:01.514 INFO:tasks.workunit.client.0.vm05.stderr:134+1 records in 2026-03-24T11:38:01.514 INFO:tasks.workunit.client.0.vm05.stderr:134+1 records out 2026-03-24T11:38:01.514 INFO:tasks.workunit.client.0.vm05.stderr:138216 bytes (138 kB, 135 KiB) copied, 0.00120206 s, 115 MB/s 2026-03-24T11:38:01.514 INFO:tasks.workunit.client.0.vm05.stderr:+ dd if=/bin/ln of=/tmp/img1 bs=1k seek=100000 2026-03-24T11:38:01.515 INFO:tasks.workunit.client.0.vm05.stderr:58+1 records in 2026-03-24T11:38:01.515 INFO:tasks.workunit.client.0.vm05.stderr:58+1 records 
out 2026-03-24T11:38:01.515 INFO:tasks.workunit.client.0.vm05.stderr:59912 bytes (60 kB, 59 KiB) copied, 0.00019304 s, 310 MB/s 2026-03-24T11:38:01.515 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd import --image-format 2 /tmp/img1 testimg1 2026-03-24T11:38:01.653 INFO:tasks.workunit.client.0.vm05.stderr: Importing image: 4% complete... Importing image: 8% complete... Importing image: 12% complete... Importing image: 16% complete... Importing image: 20% complete... Importing image: 24% complete... Importing image: 28% complete... Importing image: 32% complete... Importing image: 36% complete... Importing image: 40% complete... Importing image: 45% complete... Importing image: 49% complete... Importing image: 53% complete... Importing image: 57% complete... Importing image: 61% complete... Importing image: 65% complete... Importing image: 69% complete... Importing image: 73% complete... Importing image: 77% complete... Importing image: 81% complete... Importing image: 85% complete... Importing image: 90% complete... Importing image: 94% complete... Importing image: 98% complete... Importing image: 100% complete...done. 2026-03-24T11:38:01.657 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd resize testimg1 --size=256 --allow-shrink 2026-03-24T11:38:01.690 INFO:tasks.workunit.client.0.vm05.stderr: Resizing image: 100% complete...done. 2026-03-24T11:38:01.697 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd export testimg1 /tmp/img2 2026-03-24T11:38:01.767 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 1% complete... Exporting image: 3% complete... Exporting image: 4% complete... Exporting image: 6% complete... Exporting image: 7% complete... Exporting image: 9% complete... Exporting image: 10% complete... Exporting image: 12% complete... Exporting image: 14% complete... Exporting image: 15% complete... Exporting image: 17% complete... Exporting image: 18% complete... Exporting image: 20% complete... Exporting image: 21% complete... 
Exporting image: 23% complete... Exporting image: 25% complete... Exporting image: 26% complete... Exporting image: 28% complete... Exporting image: 29% complete... Exporting image: 31% complete... Exporting image: 32% complete... Exporting image: 34% complete... Exporting image: 35% complete... Exporting image: 37% complete... Exporting image: 39% complete... Exporting image: 40% complete... Exporting image: 42% complete... Exporting image: 43% complete... Exporting image: 45% complete... Exporting image: 46% complete... Exporting image: 48% complete... Exporting image: 50% complete... Exporting image: 51% complete... Exporting image: 53% complete... Exporting image: 54% complete... Exporting image: 56% complete... Exporting image: 57% complete... Exporting image: 59% complete... Exporting image: 60% complete... Exporting image: 62% complete... Exporting image: 64% complete... Exporting image: 65% complete... Exporting image: 67% complete... Exporting image: 68% complete... Exporting image: 70% complete... Exporting image: 71% complete... Exporting image: 73% complete... Exporting image: 75% complete... Exporting image: 76% complete... Exporting image: 78% complete... Exporting image: 79% complete... Exporting image: 81% complete... Exporting image: 82% complete... Exporting image: 84% complete... Exporting image: 85% complete... Exporting image: 87% complete... Exporting image: 89% complete... Exporting image: 90% complete... Exporting image: 92% complete... Exporting image: 93% complete... Exporting image: 95% complete... Exporting image: 96% complete... Exporting image: 98% complete... Exporting image: 100% complete...done. 2026-03-24T11:38:01.773 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create testimg1 --snap=snap1 2026-03-24T11:38:02.473 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 
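The `dd` calls earlier build /tmp/img1 as a sparse file: each write seeks well past the previous one, leaving holes that the subsequent `rbd import`/`rbd export` round trip must preserve. A sketch with the same block offsets against a throwaway temp file (input sources swapped to /dev/urandom; the original reads /bin/sh, /bin/dd, and friends for convenience):

```shell
img=$(mktemp)

# Same offsets as the workunit; each seek leaves a hole before its write.
dd if=/dev/urandom of="$img" bs=1k count=1   seek=10    2>/dev/null
dd if=/dev/urandom of="$img" bs=1k count=10  seek=100   2>/dev/null
dd if=/dev/urandom of="$img" bs=1k count=100 seek=1000  2>/dev/null

# Apparent size is the last offset plus the last write: (1000+100) KiB.
apparent=$(wc -c < "$img")
allocated=$(du -k "$img" | cut -f1)   # typically far smaller: only 111 KiB written
echo "apparent=$apparent allocated=${allocated}K"

rm -f "$img"
```

Note that plain `dd` truncates the output file to the seek offset before writing; because the offsets only ever increase, earlier writes survive and the file ends up 1126400 bytes apparent with most of it holes.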
2026-03-24T11:38:02.481 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd resize testimg1 --size=128 2026-03-24T11:38:02.508 INFO:tasks.workunit.client.0.vm05.stderr:rbd: shrinking an image is only allowed with the --allow-shrink flag 2026-03-24T11:38:02.512 INFO:tasks.workunit.client.0.vm05.stderr:+ true 2026-03-24T11:38:02.512 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd resize testimg1 --size=128 --allow-shrink 2026-03-24T11:38:02.545 INFO:tasks.workunit.client.0.vm05.stderr: Resizing image: 50% complete... Resizing image: 51% complete... Resizing image: 53% complete... Resizing image: 54% complete... Resizing image: 56% complete... Resizing image: 57% complete... Resizing image: 59% complete... Resizing image: 60% complete... Resizing image: 62% complete... Resizing image: 64% complete... Resizing image: 65% complete... Resizing image: 67% complete... Resizing image: 68% complete... Resizing image: 70% complete... Resizing image: 71% complete... Resizing image: 73% complete... Resizing image: 75% complete... Resizing image: 76% complete... Resizing image: 78% complete... Resizing image: 79% complete... Resizing image: 81% complete... Resizing image: 82% complete... Resizing image: 84% complete... Resizing image: 85% complete... Resizing image: 87% complete... Resizing image: 89% complete... Resizing image: 90% complete... Resizing image: 92% complete... Resizing image: 93% complete... Resizing image: 95% complete... Resizing image: 96% complete... Resizing image: 98% complete... Resizing image: 100% complete...done. 2026-03-24T11:38:02.551 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd export testimg1 /tmp/img3 2026-03-24T11:38:02.611 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 3% complete... Exporting image: 6% complete... Exporting image: 9% complete... Exporting image: 12% complete... Exporting image: 15% complete... Exporting image: 18% complete... Exporting image: 21% complete... Exporting image: 25% complete... 
Exporting image: 28% complete... Exporting image: 31% complete... Exporting image: 34% complete... Exporting image: 37% complete... Exporting image: 40% complete... Exporting image: 43% complete... Exporting image: 46% complete... Exporting image: 50% complete... Exporting image: 53% complete... Exporting image: 56% complete... Exporting image: 59% complete... Exporting image: 62% complete... Exporting image: 65% complete... Exporting image: 68% complete... Exporting image: 71% complete... Exporting image: 75% complete... Exporting image: 78% complete... Exporting image: 81% complete... Exporting image: 84% complete... Exporting image: 87% complete... Exporting image: 90% complete... Exporting image: 93% complete... Exporting image: 96% complete... Exporting image: 100% complete...done. 2026-03-24T11:38:02.617 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info testimg1 2026-03-24T11:38:02.617 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'size 128 MiB' 2026-03-24T11:38:02.645 INFO:tasks.workunit.client.0.vm05.stdout: size 128 MiB in 32 objects 2026-03-24T11:38:02.645 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info --snap=snap1 testimg1 2026-03-24T11:38:02.645 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'size 256 MiB' 2026-03-24T11:38:02.673 INFO:tasks.workunit.client.0.vm05.stdout: size 256 MiB in 64 objects 2026-03-24T11:38:02.673 INFO:tasks.workunit.client.0.vm05.stderr:+ rm -rf /tmp/diff-testimg1-1 /tmp/diff-testimg1-2 2026-03-24T11:38:02.675 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd export-diff testimg1 --snap=snap1 /tmp/diff-testimg1-1 2026-03-24T11:38:02.708 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 3% complete... Exporting image: 37% complete... Exporting image: 100% complete...done. 2026-03-24T11:38:02.712 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd export-diff testimg1 --from-snap=snap1 /tmp/diff-testimg1-2 2026-03-24T11:38:02.739 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 100% complete...done. 
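The lone `+ true` after the failed resize above is the workunit's expect-failure idiom: under `set -e`, writing `cmd || true` lets a command that is *supposed* to fail (here, shrinking without `--allow-shrink`) return nonzero without killing the script. A sketch with a hypothetical stand-in for `rbd resize` whose messages mimic the log (it is not the real CLI):

```shell
set -e

# Hypothetical stand-in for `rbd resize`: refuses to shrink unless
# --allow-shrink is passed, mirroring the error seen in the log.
fake_resize() {
  for arg in "$@"; do
    [ "$arg" = "--allow-shrink" ] && {
      echo "Resizing image: 100% complete...done."
      return 0
    }
  done
  echo "rbd: shrinking an image is only allowed with the --allow-shrink flag" >&2
  return 1
}

fake_resize --size=128 2>/dev/null || true   # expected failure; script survives
fake_resize --size=128 --allow-shrink        # succeeds
```

Without the `|| true`, the first call's nonzero exit would abort the whole workunit under `set -e` before the real shrink was ever attempted.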
2026-03-24T11:38:02.743 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size=1 testimg-diff1 2026-03-24T11:38:02.797 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd import-diff --sparse-size 8K /tmp/diff-testimg1-1 testimg-diff1 2026-03-24T11:38:03.480 INFO:tasks.workunit.client.0.vm05.stderr: Importing image diff: 22% complete... Importing image diff: 63% complete... Importing image diff: 99% complete... Importing image diff: 100% complete...done. 2026-03-24T11:38:03.488 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd import-diff --sparse-size 8K /tmp/diff-testimg1-2 testimg-diff1 2026-03-24T11:38:03.524 INFO:tasks.workunit.client.0.vm05.stderr: Importing image diff: 68% complete... Importing image diff: 96% complete... Importing image diff: 100% complete...done. 2026-03-24T11:38:03.531 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info testimg1 2026-03-24T11:38:03.531 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'size 128 MiB' 2026-03-24T11:38:03.557 INFO:tasks.workunit.client.0.vm05.stdout: size 128 MiB in 32 objects 2026-03-24T11:38:03.557 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info --snap=snap1 testimg1 2026-03-24T11:38:03.557 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'size 256 MiB' 2026-03-24T11:38:03.583 INFO:tasks.workunit.client.0.vm05.stdout: size 256 MiB in 64 objects 2026-03-24T11:38:03.584 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info testimg-diff1 2026-03-24T11:38:03.584 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'size 128 MiB' 2026-03-24T11:38:03.609 INFO:tasks.workunit.client.0.vm05.stdout: size 128 MiB in 32 objects 2026-03-24T11:38:03.609 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info --snap=snap1 testimg-diff1 2026-03-24T11:38:03.609 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'size 256 MiB' 2026-03-24T11:38:03.637 INFO:tasks.workunit.client.0.vm05.stdout: size 256 MiB in 64 objects 2026-03-24T11:38:03.637 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd copy testimg1 --snap=snap1 
testimg2
2026-03-24T11:38:03.694 INFO:tasks.workunit.client.0.vm05.stderr: Image copy: 3% complete... Image copy: 37% complete... Image copy: 100% complete... Image copy: 100% complete...done.
2026-03-24T11:38:03.699 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd copy testimg1 testimg3
2026-03-24T11:38:03.762 INFO:tasks.workunit.client.0.vm05.stderr: Image copy: 3% complete... Image copy: 6% complete... Image copy: 9% complete... Image copy: 12% complete... Image copy: 15% complete... Image copy: 18% complete... Image copy: 21% complete... Image copy: 25% complete... Image copy: 28% complete... Image copy: 31% complete... Image copy: 34% complete... Image copy: 37% complete... Image copy: 40% complete... Image copy: 43% complete... Image copy: 46% complete... Image copy: 50% complete... Image copy: 53% complete... Image copy: 56% complete... Image copy: 59% complete... Image copy: 62% complete... Image copy: 65% complete... Image copy: 68% complete... Image copy: 71% complete... Image copy: 75% complete... Image copy: 78% complete... Image copy: 81% complete... Image copy: 84% complete... Image copy: 87% complete... Image copy: 90% complete... Image copy: 93% complete... Image copy: 96% complete... Image copy: 100% complete... Image copy: 100% complete...done.
2026-03-24T11:38:03.767 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd copy testimg-diff1 --sparse-size 768K --snap=snap1 testimg-diff2
2026-03-24T11:38:03.828 INFO:tasks.workunit.client.0.vm05.stderr: Image copy: 3% complete... Image copy: 37% complete... Image copy: 100% complete... Image copy: 100% complete...done.
2026-03-24T11:38:03.833 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd copy testimg-diff1 --sparse-size 768K testimg-diff3
2026-03-24T11:38:03.900 INFO:tasks.workunit.client.0.vm05.stderr: Image copy: 3% complete... Image copy: 6% complete... Image copy: 9% complete... Image copy: 12% complete... Image copy: 15% complete... Image copy: 18% complete... Image copy: 21% complete... Image copy: 25% complete... Image copy: 28% complete... Image copy: 31% complete... Image copy: 34% complete... Image copy: 37% complete... Image copy: 40% complete... Image copy: 43% complete... Image copy: 46% complete... Image copy: 50% complete... Image copy: 53% complete... Image copy: 56% complete... Image copy: 59% complete... Image copy: 62% complete... Image copy: 65% complete... Image copy: 68% complete... Image copy: 71% complete... Image copy: 75% complete... Image copy: 78% complete... Image copy: 81% complete... Image copy: 84% complete... Image copy: 87% complete... Image copy: 90% complete... Image copy: 93% complete... Image copy: 96% complete... Image copy: 100% complete... Image copy: 100% complete...done.
2026-03-24T11:38:03.906 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info testimg2
2026-03-24T11:38:03.906 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'size 256 MiB'
2026-03-24T11:38:03.932 INFO:tasks.workunit.client.0.vm05.stdout: size 256 MiB in 64 objects
2026-03-24T11:38:03.932 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info testimg3
2026-03-24T11:38:03.932 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'size 128 MiB'
2026-03-24T11:38:03.957 INFO:tasks.workunit.client.0.vm05.stdout: size 128 MiB in 32 objects
2026-03-24T11:38:03.957 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info testimg-diff2
2026-03-24T11:38:03.957 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'size 256 MiB'
2026-03-24T11:38:03.981 INFO:tasks.workunit.client.0.vm05.stdout: size 256 MiB in 64 objects
2026-03-24T11:38:03.982 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info testimg-diff3
2026-03-24T11:38:03.982 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'size 128 MiB'
2026-03-24T11:38:04.007 INFO:tasks.workunit.client.0.vm05.stdout: size 128 MiB in 32 objects
2026-03-24T11:38:04.007 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd deep copy testimg1 testimg4
2026-03-24T11:38:04.497 INFO:tasks.workunit.client.0.vm05.stderr: Image deep copy: 1% complete... Image deep copy: 3% complete... Image deep copy: 4% complete... Image deep copy: 6% complete... Image deep copy: 7% complete... Image deep copy: 9% complete... Image deep copy: 10% complete... Image deep copy: 12% complete... Image deep copy: 14% complete... Image deep copy: 15% complete... Image deep copy: 17% complete... Image deep copy: 18% complete... Image deep copy: 20% complete... Image deep copy: 21% complete... Image deep copy: 23% complete... Image deep copy: 25% complete... Image deep copy: 26% complete... Image deep copy: 28% complete... Image deep copy: 29% complete... Image deep copy: 31% complete... Image deep copy: 32% complete... Image deep copy: 34% complete... Image deep copy: 35% complete... Image deep copy: 37% complete... Image deep copy: 39% complete... Image deep copy: 40% complete... Image deep copy: 42% complete... Image deep copy: 43% complete... Image deep copy: 45% complete... Image deep copy: 46% complete... Image deep copy: 48% complete... Image deep copy: 50% complete... Image deep copy: 51% complete... Image deep copy: 53% complete... Image deep copy: 54% complete... Image deep copy: 56% complete... Image deep copy: 57% complete... Image deep copy: 59% complete... Image deep copy: 60% complete... Image deep copy: 62% complete... Image deep copy: 64% complete... Image deep copy: 65% complete... Image deep copy: 67% complete... Image deep copy: 68% complete... Image deep copy: 70% complete... Image deep copy: 71% complete... Image deep copy: 73% complete... Image deep copy: 75% complete... Image deep copy: 76% complete... Image deep copy: 78% complete... Image deep copy: 79% complete... Image deep copy: 81% complete... Image deep copy: 82% complete... Image deep copy: 84% complete... Image deep copy: 85% complete... Image deep copy: 87% complete... Image deep copy: 89% complete... Image deep copy: 90% complete... Image deep copy: 92% complete... Image deep copy: 93% complete... Image deep copy: 95% complete... Image deep copy: 96% complete... Image deep copy: 98% complete... Image deep copy: 100% complete... Image deep copy: 100% complete...done.
2026-03-24T11:38:04.501 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd deep copy testimg1 --snap=snap1 testimg5
2026-03-24T11:38:05.538 INFO:tasks.workunit.client.0.vm05.stderr: Image deep copy: 1% complete... Image deep copy: 3% complete... Image deep copy: 4% complete... Image deep copy: 6% complete... Image deep copy: 7% complete... Image deep copy: 9% complete... Image deep copy: 10% complete... Image deep copy: 12% complete... Image deep copy: 14% complete... Image deep copy: 15% complete... Image deep copy: 17% complete... Image deep copy: 18% complete... Image deep copy: 20% complete... Image deep copy: 21% complete... Image deep copy: 23% complete... Image deep copy: 25% complete... Image deep copy: 26% complete... Image deep copy: 28% complete... Image deep copy: 29% complete... Image deep copy: 31% complete... Image deep copy: 32% complete... Image deep copy: 34% complete... Image deep copy: 35% complete... Image deep copy: 37% complete... Image deep copy: 39% complete... Image deep copy: 40% complete... Image deep copy: 42% complete... Image deep copy: 43% complete... Image deep copy: 45% complete... Image deep copy: 46% complete... Image deep copy: 48% complete... Image deep copy: 50% complete... Image deep copy: 51% complete... Image deep copy: 53% complete... Image deep copy: 54% complete... Image deep copy: 56% complete... Image deep copy: 57% complete... Image deep copy: 59% complete... Image deep copy: 60% complete... Image deep copy: 62% complete... Image deep copy: 64% complete... Image deep copy: 65% complete... Image deep copy: 67% complete... Image deep copy: 68% complete... Image deep copy: 70% complete... Image deep copy: 71% complete... Image deep copy: 73% complete... Image deep copy: 75% complete... Image deep copy: 76% complete... Image deep copy: 78% complete... Image deep copy: 79% complete... Image deep copy: 81% complete... Image deep copy: 82% complete... Image deep copy: 84% complete... Image deep copy: 85% complete... Image deep copy: 87% complete... Image deep copy: 89% complete... Image deep copy: 90% complete... Image deep copy: 92% complete... Image deep copy: 93% complete... Image deep copy: 95% complete... Image deep copy: 96% complete... Image deep copy: 98% complete... Image deep copy: 100% complete... Image deep copy: 100% complete...done.
2026-03-24T11:38:05.542 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info testimg4
2026-03-24T11:38:05.542 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'size 128 MiB'
2026-03-24T11:38:05.571 INFO:tasks.workunit.client.0.vm05.stdout: size 128 MiB in 32 objects
2026-03-24T11:38:05.571 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info testimg5
2026-03-24T11:38:05.572 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'size 256 MiB'
2026-03-24T11:38:05.600 INFO:tasks.workunit.client.0.vm05.stdout: size 256 MiB in 64 objects
2026-03-24T11:38:05.600 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap ls testimg4
2026-03-24T11:38:05.600 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -v SNAPID
2026-03-24T11:38:05.601 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:38:05.601 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 1
2026-03-24T11:38:05.625 INFO:tasks.workunit.client.0.vm05.stdout:1
2026-03-24T11:38:05.626 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap ls testimg4
2026-03-24T11:38:05.626 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '.*snap1.*'
2026-03-24T11:38:05.653 INFO:tasks.workunit.client.0.vm05.stdout: 15 snap1 256 MiB Tue Mar 24 11:38:04 2026
2026-03-24T11:38:05.653 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd export testimg1 /tmp/img1.new
2026-03-24T11:38:05.704 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 3% complete... Exporting image: 6% complete... Exporting image: 9% complete... Exporting image: 12% complete... Exporting image: 15% complete... Exporting image: 18% complete... Exporting image: 21% complete... Exporting image: 25% complete... Exporting image: 28% complete... Exporting image: 31% complete... Exporting image: 34% complete... Exporting image: 37% complete... Exporting image: 40% complete... Exporting image: 43% complete... Exporting image: 46% complete... Exporting image: 50% complete... Exporting image: 53% complete... Exporting image: 56% complete... Exporting image: 59% complete... Exporting image: 62% complete... Exporting image: 65% complete... Exporting image: 68% complete... Exporting image: 71% complete... Exporting image: 75% complete... Exporting image: 78% complete... Exporting image: 81% complete... Exporting image: 84% complete... Exporting image: 87% complete... Exporting image: 90% complete... Exporting image: 93% complete... Exporting image: 96% complete... Exporting image: 100% complete...done.
2026-03-24T11:38:05.708 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd export testimg2 /tmp/img2.new
2026-03-24T11:38:05.927 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:38:05.933+0000 7fa22e214640 0 --2- 192.168.123.105:0/2343807009 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x55dd7f013b00 0x55dd7f008040 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
2026-03-24T11:38:05.983 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 1% complete... Exporting image: 3% complete... Exporting image: 4% complete... Exporting image: 6% complete... Exporting image: 7% complete... Exporting image: 9% complete... Exporting image: 10% complete... Exporting image: 12% complete... Exporting image: 14% complete... Exporting image: 15% complete... Exporting image: 17% complete... Exporting image: 18% complete... Exporting image: 20% complete... Exporting image: 21% complete... Exporting image: 23% complete... Exporting image: 25% complete... Exporting image: 26% complete... Exporting image: 28% complete... Exporting image: 29% complete... Exporting image: 31% complete... Exporting image: 32% complete... Exporting image: 34% complete... Exporting image: 35% complete... Exporting image: 37% complete... Exporting image: 39% complete... Exporting image: 40% complete... Exporting image: 42% complete... Exporting image: 43% complete... Exporting image: 45% complete... Exporting image: 46% complete... Exporting image: 48% complete... Exporting image: 50% complete... Exporting image: 51% complete... Exporting image: 53% complete... Exporting image: 54% complete... Exporting image: 56% complete... Exporting image: 57% complete... Exporting image: 59% complete... Exporting image: 60% complete... Exporting image: 62% complete... Exporting image: 64% complete... Exporting image: 65% complete... Exporting image: 67% complete... Exporting image: 68% complete... Exporting image: 70% complete... Exporting image: 71% complete... Exporting image: 73% complete... Exporting image: 75% complete... Exporting image: 76% complete... Exporting image: 78% complete... Exporting image: 79% complete... Exporting image: 81% complete... Exporting image: 82% complete... Exporting image: 84% complete... Exporting image: 85% complete... Exporting image: 87% complete... Exporting image: 89% complete... Exporting image: 90% complete... Exporting image: 92% complete... Exporting image: 93% complete... Exporting image: 95% complete... Exporting image: 96% complete... Exporting image: 98% complete... Exporting image: 100% complete...done.
2026-03-24T11:38:05.988 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd export testimg3 /tmp/img3.new
2026-03-24T11:38:06.038 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 3% complete... Exporting image: 6% complete... Exporting image: 9% complete... Exporting image: 12% complete... Exporting image: 15% complete... Exporting image: 18% complete... Exporting image: 21% complete... Exporting image: 25% complete... Exporting image: 28% complete... Exporting image: 31% complete... Exporting image: 34% complete... Exporting image: 37% complete... Exporting image: 40% complete... Exporting image: 43% complete... Exporting image: 46% complete... Exporting image: 50% complete... Exporting image: 53% complete... Exporting image: 56% complete... Exporting image: 59% complete... Exporting image: 62% complete... Exporting image: 65% complete... Exporting image: 68% complete... Exporting image: 71% complete... Exporting image: 75% complete... Exporting image: 78% complete... Exporting image: 81% complete... Exporting image: 84% complete... Exporting image: 87% complete... Exporting image: 90% complete... Exporting image: 93% complete... Exporting image: 96% complete... Exporting image: 100% complete...done.
2026-03-24T11:38:06.042 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd export testimg-diff1 /tmp/img-diff1.new
2026-03-24T11:38:06.087 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 3% complete... Exporting image: 6% complete... Exporting image: 9% complete... Exporting image: 12% complete... Exporting image: 15% complete... Exporting image: 18% complete... Exporting image: 21% complete... Exporting image: 25% complete... Exporting image: 28% complete... Exporting image: 31% complete... Exporting image: 34% complete... Exporting image: 37% complete... Exporting image: 40% complete... Exporting image: 43% complete... Exporting image: 46% complete... Exporting image: 50% complete... Exporting image: 53% complete... Exporting image: 56% complete... Exporting image: 59% complete... Exporting image: 62% complete... Exporting image: 65% complete... Exporting image: 68% complete... Exporting image: 71% complete... Exporting image: 75% complete... Exporting image: 78% complete... Exporting image: 81% complete... Exporting image: 84% complete... Exporting image: 87% complete... Exporting image: 90% complete... Exporting image: 93% complete... Exporting image: 96% complete... Exporting image: 100% complete...done.
2026-03-24T11:38:06.093 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd export testimg-diff2 /tmp/img-diff2.new
2026-03-24T11:38:06.163 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 1% complete... Exporting image: 3% complete... Exporting image: 4% complete... Exporting image: 6% complete... Exporting image: 7% complete... Exporting image: 9% complete... Exporting image: 10% complete... Exporting image: 12% complete... Exporting image: 14% complete... Exporting image: 15% complete... Exporting image: 17% complete... Exporting image: 18% complete... Exporting image: 20% complete... Exporting image: 21% complete... Exporting image: 23% complete... Exporting image: 25% complete... Exporting image: 26% complete... Exporting image: 28% complete... Exporting image: 29% complete... Exporting image: 31% complete... Exporting image: 32% complete... Exporting image: 34% complete... Exporting image: 35% complete... Exporting image: 37% complete... Exporting image: 39% complete... Exporting image: 40% complete... Exporting image: 42% complete... Exporting image: 43% complete... Exporting image: 45% complete... Exporting image: 46% complete... Exporting image: 48% complete... Exporting image: 50% complete... Exporting image: 51% complete... Exporting image: 53% complete... Exporting image: 54% complete... Exporting image: 56% complete... Exporting image: 57% complete... Exporting image: 59% complete... Exporting image: 60% complete... Exporting image: 62% complete... Exporting image: 64% complete... Exporting image: 65% complete... Exporting image: 67% complete... Exporting image: 68% complete... Exporting image: 70% complete... Exporting image: 71% complete... Exporting image: 73% complete... Exporting image: 75% complete... Exporting image: 76% complete... Exporting image: 78% complete... Exporting image: 79% complete... Exporting image: 81% complete... Exporting image: 82% complete... Exporting image: 84% complete... Exporting image: 85% complete... Exporting image: 87% complete... Exporting image: 89% complete... Exporting image: 90% complete... Exporting image: 92% complete... Exporting image: 93% complete... Exporting image: 95% complete... Exporting image: 96% complete... Exporting image: 98% complete... Exporting image: 100% complete...done.
2026-03-24T11:38:06.169 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd export testimg-diff3 /tmp/img-diff3.new
2026-03-24T11:38:06.225 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 3% complete... Exporting image: 6% complete... Exporting image: 9% complete... Exporting image: 12% complete... Exporting image: 15% complete... Exporting image: 18% complete... Exporting image: 21% complete... Exporting image: 25% complete... Exporting image: 28% complete... Exporting image: 31% complete... Exporting image: 34% complete... Exporting image: 37% complete... Exporting image: 40% complete... Exporting image: 43% complete... Exporting image: 46% complete... Exporting image: 50% complete... Exporting image: 53% complete... Exporting image: 56% complete... Exporting image: 59% complete... Exporting image: 62% complete... Exporting image: 65% complete... Exporting image: 68% complete... Exporting image: 71% complete... Exporting image: 75% complete... Exporting image: 78% complete... Exporting image: 81% complete... Exporting image: 84% complete... Exporting image: 87% complete... Exporting image: 90% complete... Exporting image: 93% complete... Exporting image: 96% complete... Exporting image: 100% complete...done.
2026-03-24T11:38:06.231 INFO:tasks.workunit.client.0.vm05.stderr:+ cmp /tmp/img2 /tmp/img2.new
2026-03-24T11:38:06.367 INFO:tasks.workunit.client.0.vm05.stderr:+ cmp /tmp/img3 /tmp/img3.new
2026-03-24T11:38:06.434 INFO:tasks.workunit.client.0.vm05.stderr:+ cmp /tmp/img2 /tmp/img-diff2.new
2026-03-24T11:38:06.539 INFO:tasks.workunit.client.0.vm05.stderr:+ cmp /tmp/img3 /tmp/img-diff3.new
2026-03-24T11:38:06.592 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap rollback --snap=snap1 testimg1
2026-03-24T11:38:06.641 INFO:tasks.workunit.client.0.vm05.stderr: Rolling back to snapshot: 1% complete... Rolling back to snapshot: 3% complete... Rolling back to snapshot: 4% complete... Rolling back to snapshot: 6% complete... Rolling back to snapshot: 7% complete... Rolling back to snapshot: 9% complete... Rolling back to snapshot: 10% complete... Rolling back to snapshot: 12% complete... Rolling back to snapshot: 14% complete... Rolling back to snapshot: 15% complete... Rolling back to snapshot: 17% complete... Rolling back to snapshot: 18% complete... Rolling back to snapshot: 20% complete... Rolling back to snapshot: 21% complete... Rolling back to snapshot: 23% complete... Rolling back to snapshot: 25% complete... Rolling back to snapshot: 26% complete... Rolling back to snapshot: 28% complete... Rolling back to snapshot: 29% complete... Rolling back to snapshot: 31% complete... Rolling back to snapshot: 32% complete... Rolling back to snapshot: 34% complete... Rolling back to snapshot: 35% complete... Rolling back to snapshot: 37% complete... Rolling back to snapshot: 39% complete... Rolling back to snapshot: 40% complete... Rolling back to snapshot: 42% complete... Rolling back to snapshot: 43% complete... Rolling back to snapshot: 45% complete... Rolling back to snapshot: 46% complete... Rolling back to snapshot: 48% complete... Rolling back to snapshot: 50% complete... Rolling back to snapshot: 51% complete... Rolling back to snapshot: 53% complete... Rolling back to snapshot: 54% complete... Rolling back to snapshot: 56% complete... Rolling back to snapshot: 57% complete... Rolling back to snapshot: 59% complete... Rolling back to snapshot: 60% complete... Rolling back to snapshot: 62% complete... Rolling back to snapshot: 64% complete... Rolling back to snapshot: 65% complete... Rolling back to snapshot: 67% complete... Rolling back to snapshot: 68% complete... Rolling back to snapshot: 70% complete... Rolling back to snapshot: 71% complete... Rolling back to snapshot: 73% complete... Rolling back to snapshot: 75% complete... Rolling back to snapshot: 76% complete... Rolling back to snapshot: 78% complete... Rolling back to snapshot: 79% complete... Rolling back to snapshot: 81% complete... Rolling back to snapshot: 82% complete... Rolling back to snapshot: 84% complete... Rolling back to snapshot: 85% complete... Rolling back to snapshot: 87% complete... Rolling back to snapshot: 89% complete... Rolling back to snapshot: 90% complete... Rolling back to snapshot: 92% complete... Rolling back to snapshot: 93% complete... Rolling back to snapshot: 95% complete... Rolling back to snapshot: 96% complete... Rolling back to snapshot: 98% complete... Rolling back to snapshot: 100% complete...done.
2026-03-24T11:38:06.648 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap rollback --snap=snap1 testimg-diff1
2026-03-24T11:38:06.698 INFO:tasks.workunit.client.0.vm05.stderr: Rolling back to snapshot: 1% complete... Rolling back to snapshot: 3% complete... Rolling back to snapshot: 4% complete... Rolling back to snapshot: 6% complete... Rolling back to snapshot: 7% complete... Rolling back to snapshot: 9% complete... Rolling back to snapshot: 10% complete... Rolling back to snapshot: 12% complete... Rolling back to snapshot: 14% complete... Rolling back to snapshot: 15% complete... Rolling back to snapshot: 17% complete... Rolling back to snapshot: 18% complete... Rolling back to snapshot: 20% complete... Rolling back to snapshot: 21% complete... Rolling back to snapshot: 23% complete... Rolling back to snapshot: 25% complete... Rolling back to snapshot: 26% complete... Rolling back to snapshot: 28% complete... Rolling back to snapshot: 29% complete... Rolling back to snapshot: 31% complete... Rolling back to snapshot: 32% complete... Rolling back to snapshot: 34% complete... Rolling back to snapshot: 35% complete... Rolling back to snapshot: 37% complete... Rolling back to snapshot: 39% complete... Rolling back to snapshot: 40% complete... Rolling back to snapshot: 42% complete... Rolling back to snapshot: 43% complete... Rolling back to snapshot: 45% complete... Rolling back to snapshot: 46% complete... Rolling back to snapshot: 48% complete... Rolling back to snapshot: 50% complete... Rolling back to snapshot: 51% complete... Rolling back to snapshot: 53% complete... Rolling back to snapshot: 54% complete... Rolling back to snapshot: 56% complete... Rolling back to snapshot: 57% complete... Rolling back to snapshot: 59% complete... Rolling back to snapshot: 60% complete... Rolling back to snapshot: 62% complete... Rolling back to snapshot: 64% complete... Rolling back to snapshot: 65% complete... Rolling back to snapshot: 67% complete... Rolling back to snapshot: 68% complete... Rolling back to snapshot: 70% complete... Rolling back to snapshot: 71% complete... Rolling back to snapshot: 73% complete... Rolling back to snapshot: 75% complete... Rolling back to snapshot: 76% complete... Rolling back to snapshot: 78% complete... Rolling back to snapshot: 79% complete... Rolling back to snapshot: 81% complete... Rolling back to snapshot: 82% complete... Rolling back to snapshot: 84% complete... Rolling back to snapshot: 85% complete... Rolling back to snapshot: 87% complete... Rolling back to snapshot: 89% complete... Rolling back to snapshot: 90% complete... Rolling back to snapshot: 92% complete... Rolling back to snapshot: 93% complete... Rolling back to snapshot: 95% complete... Rolling back to snapshot: 96% complete... Rolling back to snapshot: 98% complete... Rolling back to snapshot: 100% complete...done.
2026-03-24T11:38:06.704 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info testimg1
2026-03-24T11:38:06.704 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'size 256 MiB'
2026-03-24T11:38:06.729 INFO:tasks.workunit.client.0.vm05.stdout: size 256 MiB in 64 objects
2026-03-24T11:38:06.729 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info testimg-diff1
2026-03-24T11:38:06.729 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'size 256 MiB'
2026-03-24T11:38:06.757 INFO:tasks.workunit.client.0.vm05.stdout: size 256 MiB in 64 objects
2026-03-24T11:38:06.757 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd export testimg1 /tmp/img1.snap1
2026-03-24T11:38:06.817 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 1% complete... Exporting image: 3% complete... Exporting image: 4% complete... Exporting image: 6% complete... Exporting image: 7% complete... Exporting image: 9% complete... Exporting image: 10% complete... Exporting image: 12% complete... Exporting image: 14% complete... Exporting image: 15% complete... Exporting image: 17% complete... Exporting image: 18% complete... Exporting image: 20% complete... Exporting image: 21% complete... Exporting image: 23% complete... Exporting image: 25% complete... Exporting image: 26% complete... Exporting image: 28% complete... Exporting image: 29% complete... Exporting image: 31% complete... Exporting image: 32% complete... Exporting image: 34% complete... Exporting image: 35% complete... Exporting image: 37% complete... Exporting image: 39% complete... Exporting image: 40% complete... Exporting image: 42% complete... Exporting image: 43% complete... Exporting image: 45% complete... Exporting image: 46% complete... Exporting image: 48% complete... Exporting image: 50% complete... Exporting image: 51% complete... Exporting image: 53% complete... Exporting image: 54% complete... Exporting image: 56% complete... Exporting image: 57% complete... Exporting image: 59% complete... Exporting image: 60% complete... Exporting image: 62% complete... Exporting image: 64% complete... Exporting image: 65% complete... Exporting image: 67% complete... Exporting image: 68% complete... Exporting image: 70% complete... Exporting image: 71% complete... Exporting image: 73% complete... Exporting image: 75% complete... Exporting image: 76% complete... Exporting image: 78% complete... Exporting image: 79% complete... Exporting image: 81% complete... Exporting image: 82% complete... Exporting image: 84% complete... Exporting image: 85% complete... Exporting image: 87% complete... Exporting image: 89% complete... Exporting image: 90% complete... Exporting image: 92% complete... Exporting image: 93% complete... Exporting image: 95% complete... Exporting image: 96% complete... Exporting image: 98% complete... Exporting image: 100% complete...done.
2026-03-24T11:38:06.821 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd export testimg-diff1 /tmp/img-diff1.snap1
2026-03-24T11:38:06.881 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 1% complete... Exporting image: 3% complete... Exporting image: 4% complete... Exporting image: 6% complete... Exporting image: 7% complete... Exporting image: 9% complete... Exporting image: 10% complete... Exporting image: 12% complete... Exporting image: 14% complete... Exporting image: 15% complete... Exporting image: 17% complete... Exporting image: 18% complete... Exporting image: 20% complete... Exporting image: 21% complete... Exporting image: 23% complete... Exporting image: 25% complete... Exporting image: 26% complete... Exporting image: 28% complete... Exporting image: 29% complete... Exporting image: 31% complete... Exporting image: 32% complete... Exporting image: 34% complete... Exporting image: 35% complete... Exporting image: 37% complete... Exporting image: 39% complete... Exporting image: 40% complete... Exporting image: 42% complete... Exporting image: 43% complete... Exporting image: 45% complete... Exporting image: 46% complete... Exporting image: 48% complete... Exporting image: 50% complete... Exporting image: 51% complete... Exporting image: 53% complete... Exporting image: 54% complete... Exporting image: 56% complete... Exporting image: 57% complete... Exporting image: 59% complete... Exporting image: 60% complete... Exporting image: 62% complete... Exporting image: 64% complete... Exporting image: 65% complete... Exporting image: 67% complete... Exporting image: 68% complete... Exporting image: 70% complete... Exporting image: 71% complete... Exporting image: 73% complete... Exporting image: 75% complete... Exporting image: 76% complete... Exporting image: 78% complete... Exporting image: 79% complete... Exporting image: 81% complete... Exporting image: 82% complete... Exporting image: 84% complete... Exporting image: 85% complete... Exporting image: 87% complete... Exporting image: 89% complete... Exporting image: 90% complete... Exporting image: 92% complete... Exporting image: 93% complete... Exporting image: 95% complete... Exporting image: 96% complete... Exporting image: 98% complete... Exporting image: 100% complete...done.
2026-03-24T11:38:06.885 INFO:tasks.workunit.client.0.vm05.stderr:+ cmp /tmp/img2 /tmp/img1.snap1
2026-03-24T11:38:06.993 INFO:tasks.workunit.client.0.vm05.stderr:+ cmp /tmp/img2 /tmp/img-diff1.snap1
2026-03-24T11:38:07.099 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm testimg2
2026-03-24T11:38:07.167 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 1% complete... Removing image: 3% complete... Removing image: 4% complete... Removing image: 6% complete... Removing image: 7% complete... Removing image: 9% complete... Removing image: 10% complete... Removing image: 12% complete... Removing image: 14% complete... Removing image: 15% complete... Removing image: 17% complete... Removing image: 18% complete... Removing image: 20% complete... Removing image: 21% complete... Removing image: 23% complete... Removing image: 25% complete... Removing image: 26% complete... Removing image: 28% complete... Removing image: 29% complete... Removing image: 31% complete... Removing image: 32% complete... Removing image: 34% complete... Removing image: 35% complete... Removing image: 37% complete... Removing image: 39% complete... Removing image: 40% complete... Removing image: 42% complete... Removing image: 43% complete... Removing image: 45% complete... Removing image: 46% complete... Removing image: 48% complete... Removing image: 50% complete... Removing image: 51% complete... Removing image: 53% complete... Removing image: 54% complete... Removing image: 56% complete... Removing image: 57% complete... Removing image: 59% complete... Removing image: 60% complete... Removing image: 62% complete... Removing image: 64% complete... Removing image: 65% complete... Removing image: 67% complete... Removing image: 68% complete... Removing image: 70% complete... Removing image: 71% complete... Removing image: 73% complete... Removing image: 75% complete... Removing image: 76% complete... Removing image: 78% complete... Removing image: 79% complete... Removing image: 81% complete... Removing image: 82% complete... Removing image: 84% complete... Removing image: 85% complete... Removing image: 87% complete... Removing image: 89% complete... Removing image: 90% complete... Removing image: 92% complete... Removing image: 93% complete... Removing image: 95% complete... Removing image: 96% complete... Removing image: 98% complete...2026-03-24T11:38:07.169+0000 7f40d7591640 0 -- 192.168.123.105:0/2789839867 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7f40b0008d30 msgr2=0x7f40b00291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:38:07.174 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:38:07.179 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm testimg3
2026-03-24T11:38:07.244 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 3% complete... Removing image: 6% complete... Removing image: 9% complete... Removing image: 12% complete... Removing image: 15% complete... Removing image: 18% complete... Removing image: 21% complete... Removing image: 25% complete... Removing image: 28% complete... Removing image: 31% complete... Removing image: 34% complete... Removing image: 37% complete... Removing image: 40% complete... Removing image: 43% complete... Removing image: 46% complete... Removing image: 50% complete... Removing image: 53% complete... Removing image: 56% complete... Removing image: 59% complete... Removing image: 62% complete... Removing image: 65% complete... Removing image: 68% complete... Removing image: 71% complete... Removing image: 75% complete... Removing image: 78% complete... Removing image: 81% complete... Removing image: 84% complete... Removing image: 87% complete... Removing image: 90% complete... Removing image: 93% complete... Removing image: 96% complete...2026-03-24T11:38:07.245+0000 7f30f0fdf640 0 -- 192.168.123.105:0/786299557 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55f0534f7150 msgr2=0x55f0534e6f80 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:38:07.251 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:38:07.255 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create testimg2 -s 0
2026-03-24T11:38:07.276 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:38:07.277+0000 7f330674f200 -1 librbd: Forced V1 image creation.
2026-03-24T11:38:07.282 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd cp testimg2 testimg3
2026-03-24T11:38:07.314 INFO:tasks.workunit.client.0.vm05.stderr: Image copy: 100% complete...done.
2026-03-24T11:38:07.317 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd deep cp testimg2 testimg6
2026-03-24T11:38:07.352 INFO:tasks.workunit.client.0.vm05.stderr: Image deep copy: 100% complete...done.
2026-03-24T11:38:07.355 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap rm --snap=snap1 testimg1
2026-03-24T11:38:07.527 INFO:tasks.workunit.client.0.vm05.stderr: Removing snap: 100% complete...done.
2026-03-24T11:38:07.534 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap rm --snap=snap1 testimg-diff1
2026-03-24T11:38:08.530 INFO:tasks.workunit.client.0.vm05.stderr: Removing snap: 100% complete...done.
2026-03-24T11:38:08.538 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info --snap=snap1 testimg1
2026-03-24T11:38:08.538 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'error setting snapshot context: (2) No such file or directory'
2026-03-24T11:38:08.564 INFO:tasks.workunit.client.0.vm05.stdout:error setting snapshot context: (2) No such file or directory
2026-03-24T11:38:08.565 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info --snap=snap1 testimg-diff1
2026-03-24T11:38:08.565 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'error setting snapshot context: (2) No such file or directory'
2026-03-24T11:38:08.589 INFO:tasks.workunit.client.0.vm05.stdout:error setting snapshot context: (2) No such file or directory
2026-03-24T11:38:08.589 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd sparsify testimg1
2026-03-24T11:38:08.627 INFO:tasks.workunit.client.0.vm05.stderr: Image sparsify: 1% complete... Image sparsify: 3% complete... Image sparsify: 4% complete... Image sparsify: 6% complete... Image sparsify: 7% complete... Image sparsify: 9% complete... Image sparsify: 10% complete... Image sparsify: 12% complete... Image sparsify: 14% complete... Image sparsify: 15% complete... Image sparsify: 17% complete... Image sparsify: 18% complete... Image sparsify: 20% complete... Image sparsify: 21% complete... Image sparsify: 23% complete... Image sparsify: 25% complete... Image sparsify: 26% complete... Image sparsify: 28% complete... Image sparsify: 29% complete... Image sparsify: 31% complete... Image sparsify: 32% complete... Image sparsify: 34% complete... Image sparsify: 35% complete... Image sparsify: 37% complete... Image sparsify: 39% complete... Image sparsify: 40% complete... Image sparsify: 42% complete... Image sparsify: 43% complete... Image sparsify: 45% complete... Image sparsify: 46% complete... Image sparsify: 48% complete... Image sparsify: 50% complete... Image sparsify: 51% complete... Image sparsify: 53% complete... Image sparsify: 54% complete... Image sparsify: 56% complete... Image sparsify: 57% complete... Image sparsify: 59% complete... Image sparsify: 60% complete... Image sparsify: 62% complete... Image sparsify: 64% complete... Image sparsify: 65% complete... Image sparsify: 67% complete... Image sparsify: 68% complete... Image sparsify: 70% complete... Image sparsify: 71% complete... Image sparsify: 73% complete... Image sparsify: 75% complete... Image sparsify: 76% complete... Image sparsify: 78% complete... Image sparsify: 79% complete... Image sparsify: 81% complete... Image sparsify: 82% complete... Image sparsify: 84% complete... Image sparsify: 85% complete... Image sparsify: 87% complete... Image sparsify: 89% complete... Image sparsify: 90% complete... Image sparsify: 92% complete... Image sparsify: 93% complete... Image sparsify: 95% complete... Image sparsify: 96% complete... Image sparsify: 98% complete... Image sparsify: 100% complete...done.
2026-03-24T11:38:08.633 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images 2026-03-24T11:38:08.633 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:38:08.767 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:38:08.887 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:38:09.051 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:38:09.856 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:38:10.870 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:38:10.966 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:11.077 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:11.206 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:11.322 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:11.391 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:11.455 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:11.515 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:11.582 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:11.649 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:11.718 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:11.784 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:12.053 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:12.121 INFO:tasks.workunit.client.0.vm05.stderr:+ rm -f /tmp/img1 /tmp/img1.new /tmp/img2 /tmp/img2.new /tmp/img3 /tmp/img3.new /tmp/img-diff1.new /tmp/img-diff2.new /tmp/img-diff3.new /tmp/img1.snap1 /tmp/img1.snap1 /tmp/img-diff1.snap1 2026-03-24T11:53:12.250 INFO:tasks.workunit.client.0.vm05.stdout:testing locking... 
2026-03-24T11:53:12.250 INFO:tasks.workunit.client.0.vm05.stderr:+ test_locking 2026-03-24T11:53:12.250 INFO:tasks.workunit.client.0.vm05.stderr:+ echo 'testing locking...' 2026-03-24T11:53:12.250 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images 2026-03-24T11:53:12.250 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:12.319 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:12.385 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:12.456 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:12.525 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:12.592 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:12.658 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:12.723 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:12.787 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:12.851 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:12.914 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:12.973 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:13.033 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:13.095 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:13.361 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:13.424 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:13.487 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:13.551 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:13.619 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 -s 1 test1 2026-03-24T11:53:13.658 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd lock list test1 2026-03-24T11:53:13.658 
INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:13.658 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^0$' 2026-03-24T11:53:13.685 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-24T11:53:13.685 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd lock add test1 id 2026-03-24T11:53:13.719 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd lock list test1 2026-03-24T11:53:13.720 INFO:tasks.workunit.client.0.vm05.stderr:+ grep ' 1 ' 2026-03-24T11:53:13.752 INFO:tasks.workunit.client.0.vm05.stdout:There is 1 exclusive lock on this image. 2026-03-24T11:53:13.752 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd lock list test1 2026-03-24T11:53:13.752 INFO:tasks.workunit.client.0.vm05.stderr:++ awk '{print $1;}' 2026-03-24T11:53:13.752 INFO:tasks.workunit.client.0.vm05.stderr:++ tail -n 1 2026-03-24T11:53:13.779 INFO:tasks.workunit.client.0.vm05.stderr:+ LOCKER=client.8382 2026-03-24T11:53:13.779 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd lock remove test1 id client.8382 2026-03-24T11:53:13.938 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd lock list test1 2026-03-24T11:53:13.938 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:13.938 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^0$' 2026-03-24T11:53:13.966 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-24T11:53:13.966 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd lock add test1 id --shared tag 2026-03-24T11:53:14.003 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd lock list test1 2026-03-24T11:53:14.004 INFO:tasks.workunit.client.0.vm05.stderr:+ grep ' 1 ' 2026-03-24T11:53:14.032 INFO:tasks.workunit.client.0.vm05.stdout:There is 1 shared lock on this image. 
2026-03-24T11:53:14.032 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd lock add test1 id --shared tag 2026-03-24T11:53:14.069 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd lock list test1 2026-03-24T11:53:14.069 INFO:tasks.workunit.client.0.vm05.stderr:+ grep ' 2 ' 2026-03-24T11:53:14.097 INFO:tasks.workunit.client.0.vm05.stdout:There are 2 shared locks on this image. 2026-03-24T11:53:14.097 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd lock add test1 id2 --shared tag 2026-03-24T11:53:14.132 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd lock list test1 2026-03-24T11:53:14.132 INFO:tasks.workunit.client.0.vm05.stderr:+ grep ' 3 ' 2026-03-24T11:53:14.158 INFO:tasks.workunit.client.0.vm05.stdout:There are 3 shared locks on this image. 2026-03-24T11:53:14.158 INFO:tasks.workunit.client.0.vm05.stderr:+ tail -n 1 2026-03-24T11:53:14.159 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd lock list test1 2026-03-24T11:53:14.159 INFO:tasks.workunit.client.0.vm05.stderr:+ awk '{print $2, $1;}' 2026-03-24T11:53:14.159 INFO:tasks.workunit.client.0.vm05.stderr:+ xargs rbd lock remove test1 2026-03-24T11:53:14.946 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info test1 2026-03-24T11:53:14.946 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -qE 'features:.*exclusive' 2026-03-24T11:53:14.985 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd lock list test1 2026-03-24T11:53:15.020 INFO:tasks.workunit.client.0.vm05.stderr:+ '[' -n 'There are 2 shared locks on this image. 
2026-03-24T11:53:15.020 INFO:tasks.workunit.client.0.vm05.stderr:Lock tag: tag 2026-03-24T11:53:15.020 INFO:tasks.workunit.client.0.vm05.stderr:Locker ID Address 2026-03-24T11:53:15.020 INFO:tasks.workunit.client.0.vm05.stderr:client.8396 id 192.168.123.105:0/4215727306 2026-03-24T11:53:15.020 INFO:tasks.workunit.client.0.vm05.stderr:client.8402 id 192.168.123.105:0/4074493522' ']' 2026-03-24T11:53:15.020 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd lock list test1 2026-03-24T11:53:15.020 INFO:tasks.workunit.client.0.vm05.stderr:+ tail -n 1 2026-03-24T11:53:15.020 INFO:tasks.workunit.client.0.vm05.stderr:+ awk '{print $2, $1;}' 2026-03-24T11:53:15.020 INFO:tasks.workunit.client.0.vm05.stderr:+ xargs rbd lock remove test1 2026-03-24T11:53:15.985 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd lock list test1 2026-03-24T11:53:16.014 INFO:tasks.workunit.client.0.vm05.stderr:+ '[' -n 'There is 1 shared lock on this image. 2026-03-24T11:53:16.014 INFO:tasks.workunit.client.0.vm05.stderr:Lock tag: tag 2026-03-24T11:53:16.014 INFO:tasks.workunit.client.0.vm05.stderr:Locker ID Address 2026-03-24T11:53:16.014 INFO:tasks.workunit.client.0.vm05.stderr:client.8396 id 192.168.123.105:0/4215727306' ']' 2026-03-24T11:53:16.014 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd lock list test1 2026-03-24T11:53:16.014 INFO:tasks.workunit.client.0.vm05.stderr:+ tail -n 1 2026-03-24T11:53:16.014 INFO:tasks.workunit.client.0.vm05.stderr:+ awk '{print $2, $1;}' 2026-03-24T11:53:16.014 INFO:tasks.workunit.client.0.vm05.stderr:+ xargs rbd lock remove test1 2026-03-24T11:53:16.991 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd lock list test1 2026-03-24T11:53:17.022 INFO:tasks.workunit.client.0.vm05.stderr:+ '[' -n '' ']' 2026-03-24T11:53:17.022 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test1 2026-03-24T11:53:17.102 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:53:17.106+0000 7f702bdf0640 0 -- 192.168.123.105:0/3066886092 >> 
[v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55799c907320 msgr2=0x55799c93c8a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:53:17.102 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T11:53:17.107 INFO:tasks.workunit.client.0.vm05.stdout:testing clone... 2026-03-24T11:53:17.107 INFO:tasks.workunit.client.0.vm05.stderr:+ test_clone 2026-03-24T11:53:17.107 INFO:tasks.workunit.client.0.vm05.stderr:+ echo 'testing clone...' 2026-03-24T11:53:17.107 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images 2026-03-24T11:53:17.107 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:17.176 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:17.241 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:17.312 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:17.381 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:17.450 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:17.523 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:17.590 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:17.655 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:17.719 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:17.787 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:17.858 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:17.934 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:18.008 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:18.081 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:18.149 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:18.227 
INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:18.298 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:18.363 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create test1 --image-format 2 -s 1 2026-03-24T11:53:18.398 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create test1@s1 2026-03-24T11:53:18.992 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-24T11:53:18.998 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap protect test1@s1 2026-03-24T11:53:19.030 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd pool create rbd2 8 2026-03-24T11:53:20.048 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd2' already exists 2026-03-24T11:53:20.061 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd pool init rbd2 2026-03-24T11:53:23.014 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone test1@s1 rbd2/clone 2026-03-24T11:53:23.233 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:53:23.238+0000 7f3aefc2e640 0 --2- 192.168.123.105:0/3315499739 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x55be10931ed0 0x55be109bd9a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 2026-03-24T11:53:23.462 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd -p rbd2 ls 2026-03-24T11:53:23.462 INFO:tasks.workunit.client.0.vm05.stderr:+ grep clone 2026-03-24T11:53:23.488 INFO:tasks.workunit.client.0.vm05.stdout:clone 2026-03-24T11:53:23.488 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd -p rbd2 ls -l 2026-03-24T11:53:23.488 INFO:tasks.workunit.client.0.vm05.stderr:+ grep clone 2026-03-24T11:53:23.489 INFO:tasks.workunit.client.0.vm05.stderr:+ grep test1@s1 2026-03-24T11:53:23.526 INFO:tasks.workunit.client.0.vm05.stdout:clone 1 MiB rbd/test1@s1 2 2026-03-24T11:53:23.526 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd ls 2026-03-24T11:53:23.550 
INFO:tasks.workunit.client.0.vm05.stderr:+ test test1 = test1 2026-03-24T11:53:23.550 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd flatten rbd2/clone 2026-03-24T11:53:23.585 INFO:tasks.workunit.client.0.vm05.stderr: Image flatten: 100% complete...done. 2026-03-24T11:53:23.594 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create rbd2/clone@s1 2026-03-24T11:53:24.017 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-24T11:53:24.024 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap protect rbd2/clone@s1 2026-03-24T11:53:24.064 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone rbd2/clone@s1 clone2 2026-03-24T11:53:24.113 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls 2026-03-24T11:53:24.113 INFO:tasks.workunit.client.0.vm05.stderr:+ grep clone2 2026-03-24T11:53:24.137 INFO:tasks.workunit.client.0.vm05.stdout:clone2 2026-03-24T11:53:24.137 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls -l 2026-03-24T11:53:24.138 INFO:tasks.workunit.client.0.vm05.stderr:+ grep clone2 2026-03-24T11:53:24.138 INFO:tasks.workunit.client.0.vm05.stderr:+ grep rbd2/clone@s1 2026-03-24T11:53:24.174 INFO:tasks.workunit.client.0.vm05.stdout:clone2 1 MiB rbd2/clone@s1 2 2026-03-24T11:53:24.174 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd -p rbd2 ls 2026-03-24T11:53:24.200 INFO:tasks.workunit.client.0.vm05.stderr:+ test clone = clone 2026-03-24T11:53:24.200 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone rbd2/clone clone3 2026-03-24T11:53:24.200 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'snapshot name was not specified' 2026-03-24T11:53:24.216 INFO:tasks.workunit.client.0.vm05.stdout:rbd: snapshot name was not specified 2026-03-24T11:53:24.216 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone rbd2/clone@invalid clone3 2026-03-24T11:53:24.216 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'failed to open parent image' 2026-03-24T11:53:24.247 
INFO:tasks.workunit.client.0.vm05.stdout:2026-03-24T11:53:24.246+0000 7fe12cff9640 -1 librbd::image::CloneRequest: 0x561c4b002ac0 handle_open_parent: failed to open parent image: (2) No such file or directory 2026-03-24T11:53:24.247 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone rbd2/clone --snap-id 0 clone3 2026-03-24T11:53:24.247 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'failed to open parent image' 2026-03-24T11:53:24.279 INFO:tasks.workunit.client.0.vm05.stdout:2026-03-24T11:53:24.278+0000 7ff168ff9640 -1 librbd::image::CloneRequest: 0x556c30d27a80 handle_open_parent: failed to open parent image: (2) No such file or directory 2026-03-24T11:53:24.279 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone rbd2/clone@invalid --snap-id 0 clone3 2026-03-24T11:53:24.279 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'trying to access snapshot using both name and id' 2026-03-24T11:53:24.294 INFO:tasks.workunit.client.0.vm05.stdout:rbd: trying to access snapshot using both name and id. 
2026-03-24T11:53:24.295 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd snap ls rbd2/clone --format json 2026-03-24T11:53:24.295 INFO:tasks.workunit.client.0.vm05.stderr:++ jq '.[] | select(.name == "s1") | .id' 2026-03-24T11:53:24.325 INFO:tasks.workunit.client.0.vm05.stderr:+ SNAP_ID=3 2026-03-24T11:53:24.325 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone --snap-id 3 rbd2/clone clone3 2026-03-24T11:53:24.376 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls 2026-03-24T11:53:24.376 INFO:tasks.workunit.client.0.vm05.stderr:+ grep clone3 2026-03-24T11:53:24.400 INFO:tasks.workunit.client.0.vm05.stdout:clone3 2026-03-24T11:53:24.401 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls -l 2026-03-24T11:53:24.401 INFO:tasks.workunit.client.0.vm05.stderr:+ grep clone3 2026-03-24T11:53:24.401 INFO:tasks.workunit.client.0.vm05.stderr:+ grep rbd2/clone@s1 2026-03-24T11:53:24.437 INFO:tasks.workunit.client.0.vm05.stdout:clone3 1 MiB rbd2/clone@s1 2 2026-03-24T11:53:24.438 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd -p rbd2 ls 2026-03-24T11:53:24.664 INFO:tasks.workunit.client.0.vm05.stderr:+ test clone = clone 2026-03-24T11:53:24.665 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd ls -l 2026-03-24T11:53:24.665 INFO:tasks.workunit.client.0.vm05.stderr:++ grep -c rbd2/clone@s1 2026-03-24T11:53:24.702 INFO:tasks.workunit.client.0.vm05.stderr:+ test 2 = 2 2026-03-24T11:53:24.702 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd flatten clone3 2026-03-24T11:53:24.736 INFO:tasks.workunit.client.0.vm05.stderr: Image flatten: 100% complete...done. 
2026-03-24T11:53:24.743 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd ls -l 2026-03-24T11:53:24.743 INFO:tasks.workunit.client.0.vm05.stderr:++ grep -c rbd2/clone@s1 2026-03-24T11:53:24.776 INFO:tasks.workunit.client.0.vm05.stderr:+ test 1 = 1 2026-03-24T11:53:24.776 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm clone2 2026-03-24T11:53:24.849 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:53:24.850+0000 7f3704dc0640 0 -- 192.168.123.105:0/2344710236 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x56404a8bd320 msgr2=0x56404a8f2390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:53:24.849 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T11:53:24.853 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap unprotect rbd2/clone@s1 2026-03-24T11:53:24.887 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap rm rbd2/clone@s1 2026-03-24T11:53:25.019 INFO:tasks.workunit.client.0.vm05.stderr: Removing snap: 100% complete...done. 2026-03-24T11:53:25.025 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm rbd2/clone 2026-03-24T11:53:25.087 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:53:25.090+0000 7f997b842640 0 -- 192.168.123.105:0/137310463 >> [v2:192.168.123.105:6808/3270659984,v1:192.168.123.105:6809/3270659984] conn(0x7f9954008d30 msgr2=0x7f99540291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:53:25.087 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 
2026-03-24T11:53:25.091 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm clone3 2026-03-24T11:53:25.156 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:53:25.158+0000 7f449e96e640 0 -- 192.168.123.105:0/1251895721 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x563bfcde1320 msgr2=0x563bfce16390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:53:25.161 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T11:53:25.164 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap unprotect test1@s1 2026-03-24T11:53:25.199 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap rm test1@s1 2026-03-24T11:53:26.025 INFO:tasks.workunit.client.0.vm05.stderr: Removing snap: 100% complete...done. 2026-03-24T11:53:26.034 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test1 2026-03-24T11:53:26.105 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:53:26.106+0000 7fdec4d27640 0 -- 192.168.123.105:0/1974964632 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x5555dfa6a320 msgr2=0x5555dfa9f8a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:53:26.109 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T11:53:26.113 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd pool rm rbd2 rbd2 --yes-i-really-really-mean-it 2026-03-24T11:53:27.080 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd2' does not exist 2026-03-24T11:53:27.096 INFO:tasks.workunit.client.0.vm05.stdout:testing trash... 2026-03-24T11:53:27.096 INFO:tasks.workunit.client.0.vm05.stderr:+ test_trash 2026-03-24T11:53:27.096 INFO:tasks.workunit.client.0.vm05.stderr:+ echo 'testing trash...' 
2026-03-24T11:53:27.096 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images 2026-03-24T11:53:27.096 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:27.171 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:27.244 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:27.322 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:27.398 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:27.474 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:27.545 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:27.614 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:27.682 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:27.752 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:27.824 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:27.894 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:27.962 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:28.233 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:28.298 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:28.367 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:28.437 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:28.504 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:28.574 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 -s 1 test1 2026-03-24T11:53:28.611 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 -s 1 test2 2026-03-24T11:53:28.648 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls 2026-03-24T11:53:28.648 INFO:tasks.workunit.client.0.vm05.stderr:+ grep test1 2026-03-24T11:53:28.866 
INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:53:28.870+0000 7fe3ab2a7640 0 --2- 192.168.123.105:0/1124230784 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x5556bf25a8d0 0x5556bf2688f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 2026-03-24T11:53:28.874 INFO:tasks.workunit.client.0.vm05.stdout:test1 2026-03-24T11:53:28.874 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls 2026-03-24T11:53:28.874 INFO:tasks.workunit.client.0.vm05.stderr:+ grep test2 2026-03-24T11:53:28.897 INFO:tasks.workunit.client.0.vm05.stdout:test2 2026-03-24T11:53:28.898 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls 2026-03-24T11:53:28.898 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:28.898 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 2 2026-03-24T11:53:28.920 INFO:tasks.workunit.client.0.vm05.stdout:2 2026-03-24T11:53:28.921 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls -l 2026-03-24T11:53:28.921 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'test1.*2.*' 2026-03-24T11:53:28.952 INFO:tasks.workunit.client.0.vm05.stdout:test1 1 MiB 2 2026-03-24T11:53:28.952 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls -l 2026-03-24T11:53:28.952 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'test2.*2.*' 2026-03-24T11:53:28.981 INFO:tasks.workunit.client.0.vm05.stdout:test2 1 MiB 2 2026-03-24T11:53:28.982 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv test1 2026-03-24T11:53:29.032 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls 2026-03-24T11:53:29.032 INFO:tasks.workunit.client.0.vm05.stderr:+ grep test2 2026-03-24T11:53:29.054 INFO:tasks.workunit.client.0.vm05.stdout:test2 2026-03-24T11:53:29.055 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls 2026-03-24T11:53:29.055 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:29.055 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 1 2026-03-24T11:53:29.080 
INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-24T11:53:29.080 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls -l 2026-03-24T11:53:29.080 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'test2.*2.*' 2026-03-24T11:53:29.109 INFO:tasks.workunit.client.0.vm05.stdout:test2 1 MiB 2 2026-03-24T11:53:29.109 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:29.109 INFO:tasks.workunit.client.0.vm05.stderr:+ grep test1 2026-03-24T11:53:29.134 INFO:tasks.workunit.client.0.vm05.stdout:2237d24bffb8 test1 2026-03-24T11:53:29.134 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:29.134 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:29.134 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 1 2026-03-24T11:53:29.156 INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-24T11:53:29.157 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls -l 2026-03-24T11:53:29.157 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'test1.*USER.*' 2026-03-24T11:53:29.188 INFO:tasks.workunit.client.0.vm05.stdout:2237d24bffb8 test1 USER Tue Mar 24 11:53:29 2026 expired at Tue Mar 24 11:53:29 2026 2026-03-24T11:53:29.188 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls -l 2026-03-24T11:53:29.188 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -v 'protected until' 2026-03-24T11:53:29.219 INFO:tasks.workunit.client.0.vm05.stdout:ID NAME SOURCE DELETED_AT STATUS PARENT 2026-03-24T11:53:29.219 INFO:tasks.workunit.client.0.vm05.stdout:2237d24bffb8 test1 USER Tue Mar 24 11:53:29 2026 expired at Tue Mar 24 11:53:29 2026 2026-03-24T11:53:29.220 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd trash ls 2026-03-24T11:53:29.220 INFO:tasks.workunit.client.0.vm05.stderr:++ cut -d ' ' -f 1 2026-03-24T11:53:29.245 INFO:tasks.workunit.client.0.vm05.stderr:+ ID=2237d24bffb8 2026-03-24T11:53:29.246 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash rm 2237d24bffb8 2026-03-24T11:53:29.294 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% 
complete...done. 2026-03-24T11:53:29.297 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv test2 2026-03-24T11:53:29.347 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd trash ls 2026-03-24T11:53:29.347 INFO:tasks.workunit.client.0.vm05.stderr:++ cut -d ' ' -f 1 2026-03-24T11:53:29.371 INFO:tasks.workunit.client.0.vm05.stderr:+ ID=223a97081f3c 2026-03-24T11:53:29.371 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info --image-id 223a97081f3c 2026-03-24T11:53:29.371 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'rbd image '\''test2'\''' 2026-03-24T11:53:29.399 INFO:tasks.workunit.client.0.vm05.stdout:rbd image 'test2': 2026-03-24T11:53:29.400 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd children --image-id 223a97081f3c 2026-03-24T11:53:29.400 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:29.400 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 0 2026-03-24T11:53:29.428 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-24T11:53:29.428 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash restore 223a97081f3c 2026-03-24T11:53:29.462 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls 2026-03-24T11:53:29.462 INFO:tasks.workunit.client.0.vm05.stderr:+ grep test2 2026-03-24T11:53:29.484 INFO:tasks.workunit.client.0.vm05.stdout:test2 2026-03-24T11:53:29.484 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls 2026-03-24T11:53:29.484 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:29.485 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 1 2026-03-24T11:53:29.507 INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-24T11:53:29.507 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls -l 2026-03-24T11:53:29.507 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'test2.*2.*' 2026-03-24T11:53:29.536 INFO:tasks.workunit.client.0.vm05.stdout:test2 1 MiB 2 2026-03-24T11:53:29.536 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv test2 --expires-at '3600 sec' 2026-03-24T11:53:29.582 INFO:tasks.workunit.client.0.vm05.stdout:rbd: image test2 will 
expire at 2026-03-24T12:53:29.563787+0000 2026-03-24T11:53:29.585 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:29.585 INFO:tasks.workunit.client.0.vm05.stderr:+ grep test2 2026-03-24T11:53:29.613 INFO:tasks.workunit.client.0.vm05.stdout:223a97081f3c test2 2026-03-24T11:53:29.613 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:29.613 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:29.613 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 1 2026-03-24T11:53:29.637 INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-24T11:53:29.637 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls -l 2026-03-24T11:53:29.637 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'test2.*USER.*protected until' 2026-03-24T11:53:29.665 INFO:tasks.workunit.client.0.vm05.stdout:223a97081f3c test2 USER Tue Mar 24 11:53:29 2026 protected until Tue Mar 24 12:53:29 2026 2026-03-24T11:53:29.665 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash rm 223a97081f3c 2026-03-24T11:53:29.665 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'Deferment time has not expired' 2026-03-24T11:53:29.690 INFO:tasks.workunit.client.0.vm05.stdout:Deferment time has not expired, please use --force if you really want to remove the image 2026-03-24T11:53:29.690 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash rm --image-id 223a97081f3c --force 2026-03-24T11:53:29.737 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T11:53:29.741 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 -s 1 test1 2026-03-24T11:53:29.780 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create test1@snap1 2026-03-24T11:53:30.605 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 
2026-03-24T11:53:30.611 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap protect test1@snap1 2026-03-24T11:53:30.645 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone test1@snap1 clone 2026-03-24T11:53:30.686 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv test1 2026-03-24T11:53:30.735 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:30.735 INFO:tasks.workunit.client.0.vm05.stderr:+ grep test1 2026-03-24T11:53:30.760 INFO:tasks.workunit.client.0.vm05.stdout:228fe6f305a test1 2026-03-24T11:53:30.760 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:30.760 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:30.760 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 1 2026-03-24T11:53:30.785 INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-24T11:53:30.785 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls -l 2026-03-24T11:53:30.785 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'test1.*USER.*' 2026-03-24T11:53:30.817 INFO:tasks.workunit.client.0.vm05.stdout:228fe6f305a test1 USER Tue Mar 24 11:53:30 2026 expired at Tue Mar 24 11:53:30 2026 2026-03-24T11:53:30.817 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls -l 2026-03-24T11:53:30.818 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -v 'protected until' 2026-03-24T11:53:30.848 INFO:tasks.workunit.client.0.vm05.stdout:ID NAME SOURCE DELETED_AT STATUS PARENT 2026-03-24T11:53:30.848 INFO:tasks.workunit.client.0.vm05.stdout:228fe6f305a test1 USER Tue Mar 24 11:53:30 2026 expired at Tue Mar 24 11:53:30 2026 2026-03-24T11:53:30.848 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd trash ls 2026-03-24T11:53:30.848 INFO:tasks.workunit.client.0.vm05.stderr:++ cut -d ' ' -f 1 2026-03-24T11:53:30.872 INFO:tasks.workunit.client.0.vm05.stderr:+ ID=228fe6f305a 2026-03-24T11:53:30.873 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap ls --image-id 228fe6f305a 2026-03-24T11:53:30.873 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -v SNAPID 
2026-03-24T11:53:30.873 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:30.873 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 1 2026-03-24T11:53:30.903 INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-24T11:53:30.903 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap ls --image-id 228fe6f305a 2026-03-24T11:53:30.903 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '.*snap1.*' 2026-03-24T11:53:30.934 INFO:tasks.workunit.client.0.vm05.stdout: 18 snap1 1 MiB yes Tue Mar 24 11:53:30 2026 2026-03-24T11:53:30.934 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd children --image-id 228fe6f305a 2026-03-24T11:53:30.934 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:30.934 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 1 2026-03-24T11:53:30.968 INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-24T11:53:30.968 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd children --image-id 228fe6f305a 2026-03-24T11:53:30.968 INFO:tasks.workunit.client.0.vm05.stderr:+ grep clone 2026-03-24T11:53:31.001 INFO:tasks.workunit.client.0.vm05.stdout:rbd/clone 2026-03-24T11:53:31.001 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm clone 2026-03-24T11:53:31.070 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:53:31.074+0000 7efefaaf2640 0 -- 192.168.123.105:0/3850255596 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55be22363320 msgr2=0x55be22341800 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:53:31.073 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T11:53:31.077 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap unprotect --image-id 228fe6f305a --snap snap1 2026-03-24T11:53:31.114 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap rm --image-id 228fe6f305a --snap snap1 2026-03-24T11:53:31.608 INFO:tasks.workunit.client.0.vm05.stderr: Removing snap: 100% complete...done. 
2026-03-24T11:53:31.615 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap ls --image-id 228fe6f305a 2026-03-24T11:53:31.615 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -v SNAPID 2026-03-24T11:53:31.616 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:31.616 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 0 2026-03-24T11:53:31.642 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-24T11:53:31.643 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash restore 228fe6f305a 2026-03-24T11:53:31.670 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create test1@snap1 2026-03-24T11:53:32.613 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-24T11:53:32.620 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create test1@snap2 2026-03-24T11:53:33.619 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-24T11:53:33.626 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap ls --image-id 228fe6f305a 2026-03-24T11:53:33.626 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -v SNAPID 2026-03-24T11:53:33.626 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:33.627 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 2 2026-03-24T11:53:33.655 INFO:tasks.workunit.client.0.vm05.stdout:2 2026-03-24T11:53:33.655 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap purge --image-id 228fe6f305a 2026-03-24T11:53:35.625 INFO:tasks.workunit.client.0.vm05.stderr: Removing all snapshots: 50% complete... Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done. 
2026-03-24T11:53:35.633 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap ls --image-id 228fe6f305a 2026-03-24T11:53:35.633 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -v SNAPID 2026-03-24T11:53:35.633 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:35.633 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 0 2026-03-24T11:53:35.662 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-24T11:53:35.662 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm --rbd_move_to_trash_on_remove=true --rbd_move_to_trash_on_remove_expire_seconds=3600 test1 2026-03-24T11:53:35.705 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T11:53:35.709 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:35.709 INFO:tasks.workunit.client.0.vm05.stderr:+ grep test1 2026-03-24T11:53:35.735 INFO:tasks.workunit.client.0.vm05.stdout:228fe6f305a test1 2026-03-24T11:53:35.735 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:35.735 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:35.735 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 1 2026-03-24T11:53:35.762 INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-24T11:53:35.762 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls -l 2026-03-24T11:53:35.762 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'test1.*USER.*protected until' 2026-03-24T11:53:35.794 INFO:tasks.workunit.client.0.vm05.stdout:228fe6f305a test1 USER Tue Mar 24 11:53:35 2026 protected until Tue Mar 24 12:53:35 2026 2026-03-24T11:53:35.795 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash rm 228fe6f305a 2026-03-24T11:53:35.795 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'Deferment time has not expired' 2026-03-24T11:53:35.821 INFO:tasks.workunit.client.0.vm05.stdout:Deferment time has not expired, please use --force if you really want to remove the image 2026-03-24T11:53:35.821 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash rm --image-id 228fe6f305a 
--force 2026-03-24T11:53:35.871 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T11:53:35.875 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images 2026-03-24T11:53:35.875 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:35.943 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:36.008 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:36.073 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:36.139 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:36.206 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:36.271 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:36.340 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:36.417 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:36.485 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:36.551 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:36.618 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:36.683 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:36.752 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:36.817 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:36.882 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:36.950 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:37.017 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:37.083 INFO:tasks.workunit.client.0.vm05.stderr:+ test_purge 2026-03-24T11:53:37.083 INFO:tasks.workunit.client.0.vm05.stderr:+ echo 'testing trash purge...' 2026-03-24T11:53:37.083 INFO:tasks.workunit.client.0.vm05.stdout:testing trash purge... 
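The run also checks the client-side options seen above, which divert `rbd rm` into the trash instead of deleting the image outright. A sketch under the same cluster assumption:

```shell
rbd create --image-format 2 -s 1 demo

# With these options, 'rbd rm' moves the image to the trash with a
# one-hour deferment instead of removing it immediately.
rbd rm --rbd_move_to_trash_on_remove=true \
       --rbd_move_to_trash_on_remove_expire_seconds=3600 demo

rbd trash ls -l    # lists demo, protected until roughly now + 1 hour

# As before, only a forced trash rm can delete it before expiry.
ID=$(rbd trash ls | cut -d ' ' -f 1)
rbd trash rm --image-id "$ID" --force
```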
2026-03-24T11:53:37.083 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images 2026-03-24T11:53:37.083 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:37.147 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:37.219 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:37.288 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:37.354 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:37.419 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:37.485 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:37.552 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:37.623 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:37.692 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:37.784 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:37.848 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:37.913 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:37.980 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:38.048 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:38.113 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:38.181 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:38.251 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:53:38.320 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:38.320 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:38.320 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 0 2026-03-24T11:53:38.343 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-24T11:53:38.343 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge 2026-03-24T11:53:38.363 
INFO:tasks.workunit.client.0.vm05.stderr: Removing images: 100% complete...done. 2026-03-24T11:53:38.366 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 256 testimg1 2026-03-24T11:53:38.405 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 256 testimg2 2026-03-24T11:53:38.439 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv testimg1 2026-03-24T11:53:38.484 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv testimg2 2026-03-24T11:53:38.527 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:38.527 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:38.527 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 2 2026-03-24T11:53:38.549 INFO:tasks.workunit.client.0.vm05.stdout:2 2026-03-24T11:53:38.549 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge 2026-03-24T11:53:38.616 INFO:tasks.workunit.client.0.vm05.stderr: Removing images: 50% complete... Removing images: 100% complete... Removing images: 100% complete...done. 
2026-03-24T11:53:38.620 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:38.620 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:38.620 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 0 2026-03-24T11:53:38.644 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-24T11:53:38.644 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 256 testimg1 2026-03-24T11:53:38.679 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 256 testimg2 2026-03-24T11:53:38.713 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv testimg1 --expires-at '1 hour' 2026-03-24T11:53:38.760 INFO:tasks.workunit.client.0.vm05.stdout:rbd: image testimg1 will expire at 2026-03-24T12:53:38.744749+0000 2026-03-24T11:53:38.765 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv testimg2 --expires-at '3 hours' 2026-03-24T11:53:38.815 INFO:tasks.workunit.client.0.vm05.stdout:rbd: image testimg2 will expire at 2026-03-24T14:53:38.795330+0000 2026-03-24T11:53:38.819 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:38.819 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:38.819 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 2 2026-03-24T11:53:38.843 INFO:tasks.workunit.client.0.vm05.stdout:2 2026-03-24T11:53:38.843 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge 2026-03-24T11:53:38.866 INFO:tasks.workunit.client.0.vm05.stderr: Removing images: 100% complete...done. 
2026-03-24T11:53:38.869 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:38.869 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:38.869 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 2 2026-03-24T11:53:38.895 INFO:tasks.workunit.client.0.vm05.stdout:2 2026-03-24T11:53:38.895 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge --expired-before 'now + 2 hours' 2026-03-24T11:53:38.947 INFO:tasks.workunit.client.0.vm05.stderr: Removing images: 100% complete... Removing images: 100% complete...done. 2026-03-24T11:53:38.951 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:38.952 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:38.952 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 1 2026-03-24T11:53:38.979 INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-24T11:53:38.980 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:38.980 INFO:tasks.workunit.client.0.vm05.stderr:+ grep testimg2 2026-03-24T11:53:39.007 INFO:tasks.workunit.client.0.vm05.stdout:23d38487e1c4 testimg2 2026-03-24T11:53:39.007 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge --expired-before 'now + 4 hours' 2026-03-24T11:53:39.057 INFO:tasks.workunit.client.0.vm05.stderr: Removing images: 100% complete... Removing images: 100% complete...done. 2026-03-24T11:53:39.062 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:39.063 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:39.063 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 0 2026-03-24T11:53:39.089 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-24T11:53:39.089 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 256 testimg1 2026-03-24T11:53:39.130 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create testimg1@snap 2026-03-24T11:53:39.606 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 
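This phase verifies that `rbd trash purge` respects deferment windows and that `--expired-before` shifts the cutoff. A sketch of the same steps, assuming a live cluster (image names illustrative):

```shell
rbd create --image-format 2 --size 256 img1
rbd create --image-format 2 --size 256 img2
rbd trash mv img1 --expires-at '1 hour'
rbd trash mv img2 --expires-at '3 hours'

# A plain purge removes nothing here: neither deferment has expired.
rbd trash purge

# Moving the cutoff forward catches entries selectively:
rbd trash purge --expired-before 'now + 2 hours'   # removes img1 only
rbd trash purge --expired-before 'now + 4 hours'   # removes img2 as well
```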
2026-03-24T11:53:39.621 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 256 testimg2 2026-03-24T11:53:39.656 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 256 testimg3 2026-03-24T11:53:39.693 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv testimg1 2026-03-24T11:53:39.742 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv testimg2 2026-03-24T11:53:39.790 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv testimg3 2026-03-24T11:53:39.838 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:39.838 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:39.838 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 3 2026-03-24T11:53:39.862 INFO:tasks.workunit.client.0.vm05.stdout:3 2026-03-24T11:53:39.862 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge 2026-03-24T11:53:39.862 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'some expired images could not be removed' 2026-03-24T11:53:40.166 INFO:tasks.workunit.client.0.vm05.stdout:rbd: some expired images could not be removed 2026-03-24T11:53:40.167 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:40.167 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:40.167 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 1 2026-03-24T11:53:40.189 INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-24T11:53:40.189 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:40.189 INFO:tasks.workunit.client.0.vm05.stderr:+ grep testimg1 2026-03-24T11:53:40.213 INFO:tasks.workunit.client.0.vm05.stdout:23f319d42959 testimg1 2026-03-24T11:53:40.214 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd trash ls 2026-03-24T11:53:40.214 INFO:tasks.workunit.client.0.vm05.stderr:++ awk '{ print $1 }' 2026-03-24T11:53:40.238 INFO:tasks.workunit.client.0.vm05.stderr:+ ID=23f319d42959 2026-03-24T11:53:40.238 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap purge --image-id 23f319d42959 
2026-03-24T11:53:40.632 INFO:tasks.workunit.client.0.vm05.stderr: Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done. 2026-03-24T11:53:40.641 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge 2026-03-24T11:53:40.689 INFO:tasks.workunit.client.0.vm05.stderr: Removing images: 100% complete... Removing images: 100% complete...done. 2026-03-24T11:53:40.693 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:40.693 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:40.693 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 0 2026-03-24T11:53:40.717 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-24T11:53:40.717 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 256 testimg1 2026-03-24T11:53:40.752 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 256 testimg2 2026-03-24T11:53:40.789 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create testimg2@snap 2026-03-24T11:53:41.640 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 
2026-03-24T11:53:41.650 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 256 testimg3 2026-03-24T11:53:41.687 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv testimg1 2026-03-24T11:53:41.735 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv testimg2 2026-03-24T11:53:41.786 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv testimg3 2026-03-24T11:53:42.007 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:53:42.010+0000 7ff2df7ff640 0 --2- 192.168.123.105:0/2602428981 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x5608b8015310 0x5608b80d2730 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 2026-03-24T11:53:42.039 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:42.039 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:42.039 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 3 2026-03-24T11:53:42.065 INFO:tasks.workunit.client.0.vm05.stdout:3 2026-03-24T11:53:42.066 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge 2026-03-24T11:53:42.066 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'some expired images could not be removed' 2026-03-24T11:53:42.171 INFO:tasks.workunit.client.0.vm05.stdout:rbd: some expired images could not be removed 2026-03-24T11:53:42.171 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:42.171 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:42.171 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 1 2026-03-24T11:53:42.197 INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-24T11:53:42.197 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:42.198 INFO:tasks.workunit.client.0.vm05.stderr:+ grep testimg2 2026-03-24T11:53:42.227 INFO:tasks.workunit.client.0.vm05.stdout:2422271dbb48 testimg2 2026-03-24T11:53:42.228 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd trash ls 
2026-03-24T11:53:42.228 INFO:tasks.workunit.client.0.vm05.stderr:++ awk '{ print $1 }' 2026-03-24T11:53:42.251 INFO:tasks.workunit.client.0.vm05.stderr:+ ID=2422271dbb48 2026-03-24T11:53:42.251 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap purge --image-id 2422271dbb48 2026-03-24T11:53:42.636 INFO:tasks.workunit.client.0.vm05.stderr: Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done. 2026-03-24T11:53:42.646 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge 2026-03-24T11:53:42.690 INFO:tasks.workunit.client.0.vm05.stderr: Removing images: 100% complete... Removing images: 100% complete...done. 2026-03-24T11:53:42.694 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:42.694 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:42.694 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 0 2026-03-24T11:53:42.717 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-24T11:53:42.717 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 256 testimg1 2026-03-24T11:53:42.751 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 256 testimg2 2026-03-24T11:53:42.785 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 256 testimg3 2026-03-24T11:53:42.822 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create testimg3@snap 2026-03-24T11:53:43.647 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 
2026-03-24T11:53:43.655 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv testimg1 2026-03-24T11:53:43.703 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv testimg2 2026-03-24T11:53:43.750 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv testimg3 2026-03-24T11:53:43.796 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:43.796 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:43.796 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 3 2026-03-24T11:53:43.820 INFO:tasks.workunit.client.0.vm05.stdout:3 2026-03-24T11:53:43.820 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge 2026-03-24T11:53:43.820 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'some expired images could not be removed' 2026-03-24T11:53:43.947 INFO:tasks.workunit.client.0.vm05.stdout:rbd: some expired images could not be removed 2026-03-24T11:53:43.947 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:43.948 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:43.948 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 1 2026-03-24T11:53:43.972 INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-24T11:53:43.972 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:43.972 INFO:tasks.workunit.client.0.vm05.stderr:+ grep testimg3 2026-03-24T11:53:43.995 INFO:tasks.workunit.client.0.vm05.stdout:2450f32d1a0a testimg3 2026-03-24T11:53:43.996 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd trash ls 2026-03-24T11:53:43.996 INFO:tasks.workunit.client.0.vm05.stderr:++ awk '{ print $1 }' 2026-03-24T11:53:44.020 INFO:tasks.workunit.client.0.vm05.stderr:+ ID=2450f32d1a0a 2026-03-24T11:53:44.020 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap purge --image-id 2450f32d1a0a 2026-03-24T11:53:44.645 INFO:tasks.workunit.client.0.vm05.stderr: Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done. 
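The repeated "some expired images could not be removed" checks above confirm that a trashed image which still has snapshots blocks `rbd trash purge` and stays behind. A sketch of the recovery path the test uses, assuming a live cluster:

```shell
rbd create --image-format 2 --size 256 img1
rbd snap create img1@snap
rbd trash mv img1

# The purge skips img1 and prints
# "rbd: some expired images could not be removed".
rbd trash purge

# Dropping the snapshots first unblocks it.
ID=$(rbd trash ls | awk '{ print $1 }')
rbd snap purge --image-id "$ID"
rbd trash purge
```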
2026-03-24T11:53:44.655 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge 2026-03-24T11:53:44.702 INFO:tasks.workunit.client.0.vm05.stderr: Removing images: 100% complete... Removing images: 100% complete...done. 2026-03-24T11:53:44.707 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:44.707 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:44.707 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 0 2026-03-24T11:53:44.731 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-24T11:53:44.731 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 256 testimg1 2026-03-24T11:53:44.764 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create testimg1@snap 2026-03-24T11:53:45.655 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-24T11:53:45.664 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone --rbd-default-clone-format=2 testimg1@snap testimg2 2026-03-24T11:53:45.710 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap rm testimg1@snap 2026-03-24T11:53:45.742 INFO:tasks.workunit.client.0.vm05.stderr: Removing snap: 100% complete...done. 2026-03-24T11:53:45.749 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 256 testimg3 2026-03-24T11:53:45.783 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create testimg2@snap 2026-03-24T11:53:46.833 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-24T11:53:46.841 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone --rbd-default-clone-format=2 testimg2@snap testimg4 2026-03-24T11:53:46.899 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone --rbd-default-clone-format=2 testimg2@snap testimg5 2026-03-24T11:53:47.154 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap rm testimg2@snap 2026-03-24T11:53:47.190 INFO:tasks.workunit.client.0.vm05.stderr: Removing snap: 100% complete...done. 
2026-03-24T11:53:47.198 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create testimg4@snap 2026-03-24T11:53:47.837 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-24T11:53:47.845 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone --rbd-default-clone-format=2 testimg4@snap testimg6 2026-03-24T11:53:47.906 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap rm testimg4@snap 2026-03-24T11:53:47.954 INFO:tasks.workunit.client.0.vm05.stderr: Removing snap: 100% complete...done. 2026-03-24T11:53:47.964 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv testimg1 2026-03-24T11:53:48.020 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv testimg2 2026-03-24T11:53:48.072 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv testimg3 2026-03-24T11:53:48.129 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv testimg4 2026-03-24T11:53:48.181 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:48.181 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:48.181 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 4 2026-03-24T11:53:48.205 INFO:tasks.workunit.client.0.vm05.stdout:4 2026-03-24T11:53:48.205 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge 2026-03-24T11:53:48.205 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'some expired images could not be removed' 2026-03-24T11:53:48.546 INFO:tasks.workunit.client.0.vm05.stdout:rbd: some expired images could not be removed 2026-03-24T11:53:48.546 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:48.546 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:48.546 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 3 2026-03-24T11:53:48.569 INFO:tasks.workunit.client.0.vm05.stdout:3 2026-03-24T11:53:48.569 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:48.569 INFO:tasks.workunit.client.0.vm05.stderr:+ grep testimg1 2026-03-24T11:53:48.592 
INFO:tasks.workunit.client.0.vm05.stdout:2475342e4947 testimg1 2026-03-24T11:53:48.592 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:48.592 INFO:tasks.workunit.client.0.vm05.stderr:+ grep testimg2 2026-03-24T11:53:48.617 INFO:tasks.workunit.client.0.vm05.stdout:247ba771fc25 testimg2 2026-03-24T11:53:48.617 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:48.617 INFO:tasks.workunit.client.0.vm05.stderr:+ grep testimg4 2026-03-24T11:53:48.640 INFO:tasks.workunit.client.0.vm05.stdout:2486e6441c1a testimg4 2026-03-24T11:53:48.640 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv testimg6 2026-03-24T11:53:48.684 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:48.684 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:48.684 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 4 2026-03-24T11:53:48.707 INFO:tasks.workunit.client.0.vm05.stdout:4 2026-03-24T11:53:48.708 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge 2026-03-24T11:53:48.708 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'some expired images could not be removed' 2026-03-24T11:53:48.926 INFO:tasks.workunit.client.0.vm05.stdout:rbd: some expired images could not be removed 2026-03-24T11:53:48.926 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:48.927 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:48.927 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 2 2026-03-24T11:53:48.949 INFO:tasks.workunit.client.0.vm05.stdout:2 2026-03-24T11:53:48.949 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:48.949 INFO:tasks.workunit.client.0.vm05.stderr:+ grep testimg1 2026-03-24T11:53:48.970 INFO:tasks.workunit.client.0.vm05.stdout:2475342e4947 testimg1 2026-03-24T11:53:48.970 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:48.970 INFO:tasks.workunit.client.0.vm05.stderr:+ grep testimg2 2026-03-24T11:53:48.993 
INFO:tasks.workunit.client.0.vm05.stdout:247ba771fc25 testimg2 2026-03-24T11:53:48.993 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv testimg5 2026-03-24T11:53:49.040 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls 2026-03-24T11:53:49.040 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:53:49.040 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 3 2026-03-24T11:53:49.065 INFO:tasks.workunit.client.0.vm05.stdout:3 2026-03-24T11:53:49.066 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge 2026-03-24T11:53:49.134 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:53:49.138+0000 7f09dffff640 0 -- 192.168.123.105:0/359157185 >> [v2:192.168.123.105:6808/3270659984,v1:192.168.123.105:6809/3270659984] conn(0x7f09bc0044b0 msgr2=0x7f09bc0267e0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:53:49.141 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:53:49.142+0000 7f09e5340640 0 -- 192.168.123.105:0/359157185 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7f09c405c100 msgr2=0x7f09c407c500 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:53:50.643 INFO:tasks.workunit.client.0.vm05.stderr: Removing images: 33% complete... Removing images: 66% complete... Removing images: 100% complete... Removing images: 100% complete...done. 
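The clone chains built above use `--rbd-default-clone-format=2`, where a clone keeps its parent link even after the source snapshot is removed, so a purge can only delete images whose descendants are already gone. A reduced sketch of that dependency, assuming a live cluster:

```shell
rbd create --image-format 2 --size 256 parent
rbd snap create parent@snap
rbd clone --rbd-default-clone-format=2 parent@snap child
rbd snap rm parent@snap     # permitted with clone format 2

rbd trash mv parent
rbd trash purge             # parent stays: child still references it
rbd trash mv child
rbd trash purge             # with the whole chain trashed, both go
```

This matches the pattern in the log, where each purge pass removes only the leaf-most trashed images until the remaining chain is fully in the trash.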
2026-03-24T11:53:50.647 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:53:50.647 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:53:50.647 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 0
2026-03-24T11:53:50.670 INFO:tasks.workunit.client.0.vm05.stdout:0
2026-03-24T11:53:50.671 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 256 testimg1
2026-03-24T11:53:50.708 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create testimg1@snap
2026-03-24T11:53:51.611 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T11:53:51.618 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone --rbd-default-clone-format=2 testimg1@snap testimg2
2026-03-24T11:53:51.666 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap rm testimg1@snap
2026-03-24T11:53:51.698 INFO:tasks.workunit.client.0.vm05.stderr: Removing snap: 100% complete...done.
2026-03-24T11:53:51.706 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 256 testimg3
2026-03-24T11:53:51.742 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create testimg3@snap
2026-03-24T11:53:52.650 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T11:53:52.657 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create testimg2@snap
2026-03-24T11:53:53.652 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T11:53:53.659 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone --rbd-default-clone-format=2 testimg2@snap testimg4
2026-03-24T11:53:53.709 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone --rbd-default-clone-format=2 testimg2@snap testimg5
2026-03-24T11:53:53.769 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap rm testimg2@snap
2026-03-24T11:53:53.807 INFO:tasks.workunit.client.0.vm05.stderr: Removing snap: 100% complete...done.
2026-03-24T11:53:53.816 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create testimg4@snap
2026-03-24T11:53:54.656 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T11:53:54.663 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone --rbd-default-clone-format=2 testimg4@snap testimg6
2026-03-24T11:53:54.719 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap rm testimg4@snap
2026-03-24T11:53:54.758 INFO:tasks.workunit.client.0.vm05.stderr: Removing snap: 100% complete...done.
2026-03-24T11:53:54.765 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv testimg1
2026-03-24T11:53:54.812 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv testimg2
2026-03-24T11:53:54.857 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv testimg3
2026-03-24T11:53:54.904 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv testimg4
2026-03-24T11:53:54.961 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:53:54.961 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:53:54.961 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 4
2026-03-24T11:53:54.984 INFO:tasks.workunit.client.0.vm05.stdout:4
2026-03-24T11:53:54.984 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge
2026-03-24T11:53:54.984 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'some expired images could not be removed'
2026-03-24T11:53:55.087 INFO:tasks.workunit.client.0.vm05.stdout:rbd: some expired images could not be removed
2026-03-24T11:53:55.087 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:53:55.087 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:53:55.087 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 4
2026-03-24T11:53:55.111 INFO:tasks.workunit.client.0.vm05.stdout:4
2026-03-24T11:53:55.111 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv testimg6
2026-03-24T11:53:55.159 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:53:55.159 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:53:55.159 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 5
2026-03-24T11:53:55.181 INFO:tasks.workunit.client.0.vm05.stdout:5
2026-03-24T11:53:55.181 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge
2026-03-24T11:53:55.181 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'some expired images could not be removed'
2026-03-24T11:53:55.801 INFO:tasks.workunit.client.0.vm05.stdout:rbd: some expired images could not be removed
2026-03-24T11:53:55.802 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:53:55.802 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:53:55.802 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 3
2026-03-24T11:53:55.826 INFO:tasks.workunit.client.0.vm05.stdout:3
2026-03-24T11:53:55.827 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:53:55.827 INFO:tasks.workunit.client.0.vm05.stderr:+ grep testimg1
2026-03-24T11:53:55.850 INFO:tasks.workunit.client.0.vm05.stdout:24d3161c6754 testimg1
2026-03-24T11:53:55.850 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:53:55.850 INFO:tasks.workunit.client.0.vm05.stderr:+ grep testimg2
2026-03-24T11:53:55.873 INFO:tasks.workunit.client.0.vm05.stdout:24d9ce3f8cab testimg2
2026-03-24T11:53:55.873 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:53:55.873 INFO:tasks.workunit.client.0.vm05.stderr:+ grep testimg3
2026-03-24T11:53:55.896 INFO:tasks.workunit.client.0.vm05.stdout:24df9c03b451 testimg3
2026-03-24T11:53:55.896 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv testimg5
2026-03-24T11:53:55.943 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:53:55.944 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:53:55.944 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 4
2026-03-24T11:53:55.968 INFO:tasks.workunit.client.0.vm05.stdout:4
2026-03-24T11:53:55.968 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge
2026-03-24T11:53:55.968 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'some expired images could not be removed'
2026-03-24T11:53:57.825 INFO:tasks.workunit.client.0.vm05.stdout:rbd: some expired images could not be removed
2026-03-24T11:53:57.825 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:53:57.825 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:53:57.825 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 1
2026-03-24T11:53:57.853 INFO:tasks.workunit.client.0.vm05.stdout:1
2026-03-24T11:53:57.853 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:53:57.854 INFO:tasks.workunit.client.0.vm05.stderr:+ grep testimg3
2026-03-24T11:53:57.877 INFO:tasks.workunit.client.0.vm05.stdout:24df9c03b451 testimg3
2026-03-24T11:53:57.878 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd trash ls
2026-03-24T11:53:57.878 INFO:tasks.workunit.client.0.vm05.stderr:++ awk '{ print $1 }'
2026-03-24T11:53:57.901 INFO:tasks.workunit.client.0.vm05.stderr:+ ID=24df9c03b451
2026-03-24T11:53:57.901 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap purge --image-id 24df9c03b451
2026-03-24T11:53:58.716 INFO:tasks.workunit.client.0.vm05.stderr: Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done.
2026-03-24T11:53:58.725 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge
2026-03-24T11:53:58.773 INFO:tasks.workunit.client.0.vm05.stderr: Removing images: 100% complete... Removing images: 100% complete...done.
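[Editor's note] The `++ rbd trash ls` / `++ awk '{ print $1 }'` / `+ ID=24df9c03b451` trace above is the workunit capturing the ID of the sole remaining trash entry so it can run `rbd snap purge --image-id`. A minimal stand-in sketch of that capture (the `trash_ls` function is a hypothetical stub echoing the entry text seen in the log; the real script pipes `rbd trash ls`):

```shell
# Stand-in for `rbd trash ls` output when one image remains in the trash
# (entry text copied from the log above; trash_ls itself is hypothetical).
trash_ls() { printf '%s\n' '24df9c03b451 testimg3'; }

# Same capture as the workunit: first whitespace-separated field is the image ID.
ID=$(trash_ls | awk '{ print $1 }')
echo "$ID"
```

The captured ID is then usable with ID-based subcommands such as `rbd snap purge --image-id "$ID"`, which works even though the image is no longer addressable by name.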
2026-03-24T11:53:58.777 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:53:58.777 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:53:58.777 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 0
2026-03-24T11:53:58.802 INFO:tasks.workunit.client.0.vm05.stdout:0
2026-03-24T11:53:58.802 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 256 testimg1
2026-03-24T11:53:58.838 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create testimg1@snap
2026-03-24T11:53:59.610 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T11:53:59.619 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone --rbd-default-clone-format=2 testimg1@snap testimg2
2026-03-24T11:53:59.671 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap rm testimg1@snap
2026-03-24T11:53:59.706 INFO:tasks.workunit.client.0.vm05.stderr: Removing snap: 100% complete...done.
2026-03-24T11:53:59.715 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 256 testimg3
2026-03-24T11:53:59.752 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create testimg2@snap
2026-03-24T11:54:00.726 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T11:54:00.733 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone --rbd-default-clone-format=2 testimg2@snap testimg4
2026-03-24T11:54:00.778 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone --rbd-default-clone-format=2 testimg2@snap testimg5
2026-03-24T11:54:00.827 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap rm testimg2@snap
2026-03-24T11:54:00.864 INFO:tasks.workunit.client.0.vm05.stderr: Removing snap: 100% complete...done.
2026-03-24T11:54:00.873 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create testimg4@snap
2026-03-24T11:54:01.734 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T11:54:01.744 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone --rbd-default-clone-format=2 testimg4@snap testimg6
2026-03-24T11:54:01.810 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap rm testimg4@snap
2026-03-24T11:54:01.869 INFO:tasks.workunit.client.0.vm05.stderr: Removing snap: 100% complete...done.
2026-03-24T11:54:01.880 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm --rbd_move_parent_to_trash_on_remove=true testimg1
2026-03-24T11:54:01.942 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:54:01.947 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm --rbd_move_parent_to_trash_on_remove=true testimg2
2026-03-24T11:54:02.006 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:54:02.011 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv testimg3
2026-03-24T11:54:02.060 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm --rbd_move_parent_to_trash_on_remove=true testimg4
2026-03-24T11:54:02.120 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:54:02.124 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:54:02.124 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:54:02.124 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 4
2026-03-24T11:54:02.147 INFO:tasks.workunit.client.0.vm05.stdout:4
2026-03-24T11:54:02.148 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge
2026-03-24T11:54:02.148 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'some expired images could not be removed'
2026-03-24T11:54:02.303 INFO:tasks.workunit.client.0.vm05.stdout:rbd: some expired images could not be removed
2026-03-24T11:54:02.303 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:54:02.304 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:54:02.304 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 3
2026-03-24T11:54:02.329 INFO:tasks.workunit.client.0.vm05.stdout:3
2026-03-24T11:54:02.329 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:54:02.329 INFO:tasks.workunit.client.0.vm05.stderr:+ grep testimg1
2026-03-24T11:54:02.352 INFO:tasks.workunit.client.0.vm05.stdout:253a58bcd4ed testimg1
2026-03-24T11:54:02.352 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:54:02.353 INFO:tasks.workunit.client.0.vm05.stderr:+ grep testimg2
2026-03-24T11:54:02.378 INFO:tasks.workunit.client.0.vm05.stdout:25407955de19 testimg2
2026-03-24T11:54:02.379 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:54:02.379 INFO:tasks.workunit.client.0.vm05.stderr:+ grep testimg4
2026-03-24T11:54:02.403 INFO:tasks.workunit.client.0.vm05.stdout:254c91ee8db2 testimg4
2026-03-24T11:54:02.403 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv testimg6
2026-03-24T11:54:02.452 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:54:02.453 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:54:02.453 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 4
2026-03-24T11:54:02.476 INFO:tasks.workunit.client.0.vm05.stdout:4
2026-03-24T11:54:02.476 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge
2026-03-24T11:54:02.477 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'some expired images could not be removed'
2026-03-24T11:54:02.800 INFO:tasks.workunit.client.0.vm05.stdout:rbd: some expired images could not be removed
2026-03-24T11:54:02.800 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:54:02.800 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:54:02.800 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 2
2026-03-24T11:54:02.823 INFO:tasks.workunit.client.0.vm05.stdout:2
2026-03-24T11:54:02.823 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:54:02.823 INFO:tasks.workunit.client.0.vm05.stderr:+ grep testimg1
2026-03-24T11:54:02.845 INFO:tasks.workunit.client.0.vm05.stdout:253a58bcd4ed testimg1
2026-03-24T11:54:02.845 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:54:02.845 INFO:tasks.workunit.client.0.vm05.stderr:+ grep testimg2
2026-03-24T11:54:02.867 INFO:tasks.workunit.client.0.vm05.stdout:25407955de19 testimg2
2026-03-24T11:54:02.867 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv testimg5
2026-03-24T11:54:02.914 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:54:02.914 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:54:02.914 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 3
2026-03-24T11:54:02.938 INFO:tasks.workunit.client.0.vm05.stdout:3
2026-03-24T11:54:02.938 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge
2026-03-24T11:54:03.002 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:54:03.006+0000 7fddba3f5640 0 -- 192.168.123.105:0/643030962 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7fdd90046730 msgr2=0x7fdd90066b10 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:54:03.009 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:54:03.010+0000 7fddbb67e640 0 -- 192.168.123.105:0/643030962 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x5567bb03eb20 msgr2=0x5567bb0705d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:54:04.780 INFO:tasks.workunit.client.0.vm05.stderr: Removing images: 33% complete... Removing images: 66% complete... Removing images: 100% complete... Removing images: 100% complete...done.
2026-03-24T11:54:04.785 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:54:04.785 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:54:04.785 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 0
2026-03-24T11:54:04.808 INFO:tasks.workunit.client.0.vm05.stdout:0
2026-03-24T11:54:04.808 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 256 testimg1
2026-03-24T11:54:04.844 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create testimg1@snap
2026-03-24T11:54:05.746 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T11:54:05.754 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone --rbd-default-clone-format=2 testimg1@snap testimg2
2026-03-24T11:54:05.828 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap rm testimg1@snap
2026-03-24T11:54:05.860 INFO:tasks.workunit.client.0.vm05.stderr: Removing snap: 100% complete...done.
2026-03-24T11:54:05.868 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 256 testimg3
2026-03-24T11:54:05.907 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create testimg3@snap
2026-03-24T11:54:06.755 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T11:54:06.766 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create testimg2@snap
2026-03-24T11:54:07.759 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T11:54:07.768 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone --rbd-default-clone-format=2 testimg2@snap testimg4
2026-03-24T11:54:07.821 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone --rbd-default-clone-format=2 testimg2@snap testimg5
2026-03-24T11:54:07.875 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap rm testimg2@snap
2026-03-24T11:54:07.914 INFO:tasks.workunit.client.0.vm05.stderr: Removing snap: 100% complete...done.
2026-03-24T11:54:07.922 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create testimg4@snap
2026-03-24T11:54:08.764 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T11:54:08.773 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone --rbd-default-clone-format=2 testimg4@snap testimg6
2026-03-24T11:54:08.835 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap rm testimg4@snap
2026-03-24T11:54:08.923 INFO:tasks.workunit.client.0.vm05.stderr: Removing snap: 100% complete...done.
2026-03-24T11:54:08.935 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm --rbd_move_parent_to_trash_on_remove=true testimg1
2026-03-24T11:54:08.998 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:54:09.003 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm --rbd_move_parent_to_trash_on_remove=true testimg2
2026-03-24T11:54:09.073 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:54:09.077 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv testimg3
2026-03-24T11:54:09.145 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm --rbd_move_parent_to_trash_on_remove=true testimg4
2026-03-24T11:54:09.211 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:54:09.215 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:54:09.215 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:54:09.215 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 4
2026-03-24T11:54:09.240 INFO:tasks.workunit.client.0.vm05.stdout:4
2026-03-24T11:54:09.241 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge
2026-03-24T11:54:09.241 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'some expired images could not be removed'
2026-03-24T11:54:09.338 INFO:tasks.workunit.client.0.vm05.stdout:rbd: some expired images could not be removed
2026-03-24T11:54:09.338 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:54:09.338 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:54:09.338 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 4
2026-03-24T11:54:09.361 INFO:tasks.workunit.client.0.vm05.stdout:4
2026-03-24T11:54:09.361 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv testimg6
2026-03-24T11:54:09.414 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:54:09.414 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:54:09.414 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 5
2026-03-24T11:54:09.437 INFO:tasks.workunit.client.0.vm05.stdout:5
2026-03-24T11:54:09.438 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge
2026-03-24T11:54:09.438 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'some expired images could not be removed'
2026-03-24T11:54:09.683 INFO:tasks.workunit.client.0.vm05.stdout:rbd: some expired images could not be removed
2026-03-24T11:54:09.683 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:54:09.683 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:54:09.683 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 3
2026-03-24T11:54:09.709 INFO:tasks.workunit.client.0.vm05.stdout:3
2026-03-24T11:54:09.709 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:54:09.709 INFO:tasks.workunit.client.0.vm05.stderr:+ grep testimg1
2026-03-24T11:54:09.731 INFO:tasks.workunit.client.0.vm05.stdout:25975e94bb31 testimg1
2026-03-24T11:54:09.732 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:54:09.732 INFO:tasks.workunit.client.0.vm05.stderr:+ grep testimg2
2026-03-24T11:54:09.755 INFO:tasks.workunit.client.0.vm05.stdout:259d8b9e2a65 testimg2
2026-03-24T11:54:09.756 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:54:09.756 INFO:tasks.workunit.client.0.vm05.stderr:+ grep testimg3
2026-03-24T11:54:09.780 INFO:tasks.workunit.client.0.vm05.stdout:25a3fa1d4276 testimg3
2026-03-24T11:54:09.780 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv testimg5
2026-03-24T11:54:09.825 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:54:09.826 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:54:09.826 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 4
2026-03-24T11:54:09.849 INFO:tasks.workunit.client.0.vm05.stdout:4
2026-03-24T11:54:09.849 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge
2026-03-24T11:54:09.849 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'some expired images could not be removed'
2026-03-24T11:54:11.854 INFO:tasks.workunit.client.0.vm05.stdout:rbd: some expired images could not be removed
2026-03-24T11:54:11.854 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:54:11.854 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:54:11.854 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 1
2026-03-24T11:54:11.878 INFO:tasks.workunit.client.0.vm05.stdout:1
2026-03-24T11:54:11.878 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:54:11.878 INFO:tasks.workunit.client.0.vm05.stderr:+ grep testimg3
2026-03-24T11:54:11.901 INFO:tasks.workunit.client.0.vm05.stdout:25a3fa1d4276 testimg3
2026-03-24T11:54:11.902 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd trash ls
2026-03-24T11:54:11.902 INFO:tasks.workunit.client.0.vm05.stderr:++ awk '{ print $1 }'
2026-03-24T11:54:11.927 INFO:tasks.workunit.client.0.vm05.stderr:+ ID=25a3fa1d4276
2026-03-24T11:54:11.927 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap purge --image-id 25a3fa1d4276
2026-03-24T11:54:12.805 INFO:tasks.workunit.client.0.vm05.stderr: Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done.
2026-03-24T11:54:12.815 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge
2026-03-24T11:54:12.860 INFO:tasks.workunit.client.0.vm05.stderr: Removing images: 100% complete... Removing images: 100% complete...done.
2026-03-24T11:54:12.864 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls
2026-03-24T11:54:12.865 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:54:12.865 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 0
2026-03-24T11:54:12.889 INFO:tasks.workunit.client.0.vm05.stdout:0
2026-03-24T11:54:12.889 INFO:tasks.workunit.client.0.vm05.stderr:+ test_deep_copy_clone
2026-03-24T11:54:12.889 INFO:tasks.workunit.client.0.vm05.stderr:+ echo 'testing deep copy clone...'
2026-03-24T11:54:12.889 INFO:tasks.workunit.client.0.vm05.stdout:testing deep copy clone...
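[Editor's note] Throughout the trash tests above, the workunit asserts the number of trash entries with the idiom `rbd trash ls | wc -l | grep N`: `wc -l` prints the entry count and `grep` fails (non-zero exit, which aborts the script under `set -e`) when the count does not match. A stand-in sketch of that counting pipeline (`trash_ls` is a hypothetical stub; the entry texts are copied from the log):

```shell
# Stand-in for `rbd trash ls` with three entries in the trash
# (entries copied from the log above; trash_ls itself is hypothetical).
trash_ls() {
  printf '%s\n' '25975e94bb31 testimg1' \
                '259d8b9e2a65 testimg2' \
                '25a3fa1d4276 testimg3'
}

# Same assertion as the workunit: grep exits non-zero if the count is not 3,
# which fails the test script when it runs under `set -e`.
trash_ls | wc -l | grep 3
```

Note that `grep 3` is a substring match, so a count of 13 or 30 would also pass; the pattern works here because the expected counts in this test are all single digits.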
2026-03-24T11:54:12.889 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images
2026-03-24T11:54:12.890 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:12.959 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:13.024 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:13.090 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:13.156 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:13.221 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:13.287 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:13.356 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:13.423 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:13.490 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:13.557 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:13.624 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:13.690 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:13.760 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:13.830 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:13.898 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:13.970 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:14.053 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:14.125 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create testimg1 --image-format 2 --size 256
2026-03-24T11:54:14.345 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:54:14.346+0000 7f1b933b0640 0 --2- 192.168.123.105:0/3567606124 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x55606f5df090 0x55606f527ad0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
2026-03-24T11:54:14.363 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create testimg1 --snap=snap1
2026-03-24T11:54:14.811 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T11:54:14.820 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap protect testimg1@snap1
2026-03-24T11:54:14.856 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone testimg1@snap1 testimg2
2026-03-24T11:54:14.905 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create testimg2@snap2
2026-03-24T11:54:15.814 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T11:54:15.822 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd deep copy testimg2 testimg3
2026-03-24T11:54:15.887 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:54:15.890+0000 7fb4cbfff640 0 -- 192.168.123.105:0/2935066644 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7fb4a40293d0 msgr2=0x7fb4a4029870 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:54:16.823 INFO:tasks.workunit.client.0.vm05.stderr: Image deep copy: 1% complete... Image deep copy: 3% complete... Image deep copy: 4% complete... Image deep copy: 6% complete... Image deep copy: 7% complete... Image deep copy: 9% complete... Image deep copy: 10% complete... Image deep copy: 12% complete... Image deep copy: 14% complete... Image deep copy: 15% complete... Image deep copy: 17% complete... Image deep copy: 18% complete... Image deep copy: 20% complete... Image deep copy: 21% complete... Image deep copy: 23% complete... Image deep copy: 25% complete... Image deep copy: 26% complete... Image deep copy: 28% complete... Image deep copy: 29% complete... Image deep copy: 31% complete... Image deep copy: 32% complete... Image deep copy: 34% complete... Image deep copy: 35% complete... Image deep copy: 37% complete... Image deep copy: 39% complete... Image deep copy: 40% complete... Image deep copy: 42% complete... Image deep copy: 43% complete... Image deep copy: 45% complete... Image deep copy: 46% complete... Image deep copy: 48% complete... Image deep copy: 50% complete... Image deep copy: 51% complete... Image deep copy: 53% complete... Image deep copy: 54% complete... Image deep copy: 56% complete... Image deep copy: 57% complete... Image deep copy: 59% complete... Image deep copy: 60% complete... Image deep copy: 62% complete... Image deep copy: 64% complete... Image deep copy: 65% complete... Image deep copy: 67% complete... Image deep copy: 68% complete... Image deep copy: 70% complete... Image deep copy: 71% complete... Image deep copy: 73% complete... Image deep copy: 75% complete... Image deep copy: 76% complete... Image deep copy: 78% complete... Image deep copy: 79% complete... Image deep copy: 81% complete... Image deep copy: 82% complete... Image deep copy: 84% complete... Image deep copy: 85% complete... Image deep copy: 87% complete... Image deep copy: 89% complete... Image deep copy: 90% complete... Image deep copy: 92% complete... Image deep copy: 93% complete... Image deep copy: 95% complete... Image deep copy: 96% complete... Image deep copy: 98% complete... Image deep copy: 100% complete...2026-03-24T11:54:16.826+0000 7fb4d0ab3640 0 -- 192.168.123.105:0/2935066644 >> [v2:192.168.123.105:6808/3270659984,v1:192.168.123.105:6809/3270659984] conn(0x7fb4a40047b0 msgr2=0x7fb4a4025100 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:54:16.824 INFO:tasks.workunit.client.0.vm05.stderr: Image deep copy: 100% complete...done.
2026-03-24T11:54:16.829 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info testimg3
2026-03-24T11:54:16.829 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'size 256 MiB'
2026-03-24T11:54:16.864 INFO:tasks.workunit.client.0.vm05.stdout: size 256 MiB in 64 objects
2026-03-24T11:54:16.864 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info testimg3
2026-03-24T11:54:16.864 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'parent: rbd/testimg1@snap1'
2026-03-24T11:54:16.898 INFO:tasks.workunit.client.0.vm05.stdout: parent: rbd/testimg1@snap1
2026-03-24T11:54:16.898 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap ls testimg3
2026-03-24T11:54:16.898 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -v SNAPID
2026-03-24T11:54:16.898 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:54:16.898 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 1
2026-03-24T11:54:16.931 INFO:tasks.workunit.client.0.vm05.stdout:1
2026-03-24T11:54:16.931 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap ls testimg3
2026-03-24T11:54:16.931 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '.*snap2.*'
2026-03-24T11:54:16.961 INFO:tasks.workunit.client.0.vm05.stdout: 40 snap2 256 MiB Tue Mar 24 11:54:16 2026
2026-03-24T11:54:16.962 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info testimg2
2026-03-24T11:54:16.962 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'features:.*deep-flatten'
2026-03-24T11:54:16.995 INFO:tasks.workunit.client.0.vm05.stdout: features: layering, exclusive-lock, object-map, fast-diff, deep-flatten
2026-03-24T11:54:16.995 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info testimg3
2026-03-24T11:54:16.995 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'features:.*deep-flatten'
2026-03-24T11:54:17.029 INFO:tasks.workunit.client.0.vm05.stdout: features: layering, exclusive-lock, object-map, fast-diff, deep-flatten
2026-03-24T11:54:17.029 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd flatten testimg2
2026-03-24T11:54:17.067 INFO:tasks.workunit.client.0.vm05.stderr: Image flatten: 1% complete... Image flatten: 3% complete... Image flatten: 4% complete... Image flatten: 6% complete... Image flatten: 7% complete... Image flatten: 9% complete... Image flatten: 10% complete... Image flatten: 12% complete... Image flatten: 14% complete... Image flatten: 15% complete... Image flatten: 17% complete... Image flatten: 18% complete... Image flatten: 20% complete... Image flatten: 21% complete... Image flatten: 23% complete... Image flatten: 25% complete... Image flatten: 26% complete... Image flatten: 28% complete... Image flatten: 29% complete... Image flatten: 31% complete... Image flatten: 32% complete... Image flatten: 34% complete... Image flatten: 35% complete... Image flatten: 37% complete... Image flatten: 39% complete... Image flatten: 40% complete... Image flatten: 42% complete... Image flatten: 43% complete... Image flatten: 45% complete... Image flatten: 46% complete... Image flatten: 48% complete... Image flatten: 50% complete... Image flatten: 51% complete... Image flatten: 53% complete... Image flatten: 54% complete... Image flatten: 56% complete... Image flatten: 57% complete... Image flatten: 59% complete... Image flatten: 60% complete... Image flatten: 62% complete... Image flatten: 64% complete... Image flatten: 65% complete... Image flatten: 67% complete... Image flatten: 68% complete... Image flatten: 70% complete... Image flatten: 71% complete... Image flatten: 73% complete... Image flatten: 75% complete... Image flatten: 76% complete... Image flatten: 78% complete... Image flatten: 79% complete... Image flatten: 81% complete... Image flatten: 82% complete... Image flatten: 84% complete... Image flatten: 85% complete... Image flatten: 87% complete... Image flatten: 89% complete... Image flatten: 90% complete... Image flatten: 92% complete... Image flatten: 93% complete... Image flatten: 95% complete... Image flatten: 96% complete... Image flatten: 98% complete... Image flatten: 100% complete...done.
2026-03-24T11:54:17.074 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd flatten testimg3
2026-03-24T11:54:17.112 INFO:tasks.workunit.client.0.vm05.stderr: Image flatten: 1% complete... Image flatten: 3% complete... Image flatten: 4% complete... Image flatten: 6% complete... Image flatten: 7% complete... Image flatten: 9% complete... Image flatten: 10% complete... Image flatten: 12% complete... Image flatten: 14% complete... Image flatten: 15% complete... Image flatten: 17% complete... Image flatten: 18% complete... Image flatten: 20% complete... Image flatten: 21% complete... Image flatten: 23% complete... Image flatten: 25% complete... Image flatten: 26% complete... Image flatten: 28% complete... Image flatten: 29% complete... Image flatten: 31% complete... Image flatten: 32% complete... Image flatten: 34% complete... Image flatten: 35% complete... Image flatten: 37% complete... Image flatten: 39% complete... Image flatten: 40% complete... Image flatten: 42% complete... Image flatten: 43% complete... Image flatten: 45% complete... Image flatten: 46% complete... Image flatten: 48% complete... Image flatten: 50% complete... Image flatten: 51% complete... Image flatten: 53% complete... Image flatten: 54% complete... Image flatten: 56% complete... Image flatten: 57% complete... Image flatten: 59% complete... Image flatten: 60% complete... Image flatten: 62% complete... Image flatten: 64% complete... Image flatten: 65% complete... Image flatten: 67% complete... Image flatten: 68% complete... Image flatten: 70% complete... Image flatten: 71% complete... Image flatten: 73% complete... Image flatten: 75% complete... Image flatten: 76% complete... Image flatten: 78% complete... Image flatten: 79% complete... Image flatten: 81% complete... Image flatten: 82% complete... Image flatten: 84% complete... Image flatten: 85% complete... Image flatten: 87% complete... Image flatten: 89% complete... Image flatten: 90% complete... Image flatten: 92% complete... Image flatten: 93% complete... Image flatten: 95% complete... Image flatten: 96% complete... Image flatten: 98% complete... Image flatten: 100% complete...done.
2026-03-24T11:54:17.119 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap unprotect testimg1@snap1
2026-03-24T11:54:17.155 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap purge testimg2
2026-03-24T11:54:18.025 INFO:tasks.workunit.client.0.vm05.stderr: Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done.
2026-03-24T11:54:18.032 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap purge testimg3
2026-03-24T11:54:19.029 INFO:tasks.workunit.client.0.vm05.stderr: Removing all snapshots: 100% complete... Removing all snapshots: 100% complete...done.
2026-03-24T11:54:19.036 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm testimg2
2026-03-24T11:54:19.110 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 1% complete... Removing image: 3% complete... Removing image: 4% complete... Removing image: 6% complete... Removing image: 7% complete... Removing image: 9% complete... Removing image: 10% complete... Removing image: 12% complete... Removing image: 14% complete... Removing image: 15% complete... Removing image: 17% complete... Removing image: 18% complete... Removing image: 20% complete... Removing image: 21% complete... Removing image: 23% complete... Removing image: 25% complete... Removing image: 26% complete... Removing image: 28% complete... Removing image: 29% complete... Removing image: 31% complete... Removing image: 32% complete... Removing image: 34% complete... Removing image: 35% complete... Removing image: 37% complete... Removing image: 39% complete... Removing image: 40% complete... Removing image: 42% complete... Removing image: 43% complete... Removing image: 45% complete... Removing image: 46% complete... Removing image: 48% complete... Removing image: 50% complete... Removing image: 51% complete...
Removing image: 53% complete... Removing image: 54% complete... Removing image: 56% complete... Removing image: 57% complete... Removing image: 59% complete... Removing image: 60% complete... Removing image: 62% complete... Removing image: 64% complete... Removing image: 65% complete... Removing image: 67% complete... Removing image: 68% complete... Removing image: 70% complete... Removing image: 71% complete... Removing image: 73% complete... Removing image: 75% complete... Removing image: 76% complete... Removing image: 78% complete... Removing image: 79% complete... Removing image: 81% complete... Removing image: 82% complete... Removing image: 84% complete... Removing image: 85% complete... Removing image: 87% complete... Removing image: 89% complete... Removing image: 90% complete... Removing image: 92% complete... Removing image: 93% complete... Removing image: 95% complete... Removing image: 96% complete... Removing image: 98% complete...2026-03-24T11:54:19.114+0000 7fdf8f65f640 0 -- 192.168.123.105:0/1882369379 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55baae5f4e30 msgr2=0x55baae62a4a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:54:19.110 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T11:54:19.114 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm testimg3 2026-03-24T11:54:19.183 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 1% complete... Removing image: 3% complete... Removing image: 4% complete... Removing image: 6% complete... Removing image: 7% complete... Removing image: 9% complete... Removing image: 10% complete... Removing image: 12% complete... Removing image: 14% complete... Removing image: 15% complete... Removing image: 17% complete... Removing image: 18% complete... Removing image: 20% complete... Removing image: 21% complete... Removing image: 23% complete... Removing image: 25% complete... 
Removing image: 26% complete... Removing image: 28% complete... Removing image: 29% complete... Removing image: 31% complete... Removing image: 32% complete... Removing image: 34% complete... Removing image: 35% complete... Removing image: 37% complete... Removing image: 39% complete... Removing image: 40% complete... Removing image: 42% complete... Removing image: 43% complete... Removing image: 45% complete... Removing image: 46% complete... Removing image: 48% complete... Removing image: 50% complete... Removing image: 51% complete... Removing image: 53% complete... Removing image: 54% complete... Removing image: 56% complete... Removing image: 57% complete... Removing image: 59% complete... Removing image: 60% complete... Removing image: 62% complete... Removing image: 64% complete... Removing image: 65% complete... Removing image: 67% complete... Removing image: 68% complete... Removing image: 70% complete... Removing image: 71% complete... Removing image: 73% complete... Removing image: 75% complete... Removing image: 76% complete... Removing image: 78% complete... Removing image: 79% complete... Removing image: 81% complete... Removing image: 82% complete... Removing image: 84% complete... Removing image: 85% complete... Removing image: 87% complete... Removing image: 89% complete... Removing image: 90% complete... Removing image: 92% complete... Removing image: 93% complete... Removing image: 95% complete... Removing image: 96% complete... Removing image: 98% complete...2026-03-24T11:54:19.186+0000 7fb0cec69640 0 -- 192.168.123.105:0/2365449781 >> [v2:192.168.123.105:6808/3270659984,v1:192.168.123.105:6809/3270659984] conn(0x55e5ab35efc0 msgr2=0x55e5ab37f440 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:54:19.189 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 
2026-03-24T11:54:19.193 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap protect testimg1@snap1 2026-03-24T11:54:19.228 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone testimg1@snap1 testimg2 2026-03-24T11:54:19.278 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create testimg2@snap2 2026-03-24T11:54:19.611 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-24T11:54:19.620 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd deep copy --flatten testimg2 testimg3 2026-03-24T11:54:20.698 INFO:tasks.workunit.client.0.vm05.stderr: Image deep copy: 1% complete...2026-03-24T11:54:20.702+0000 7f55aaa18640 0 -- 192.168.123.105:0/545058583 >> [v2:192.168.123.105:6808/3270659984,v1:192.168.123.105:6809/3270659984] conn(0x7f55880047d0 msgr2=0x7f5588025140 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:54:20.779 INFO:tasks.workunit.client.0.vm05.stderr: Image deep copy: 3% complete... Image deep copy: 4% complete...2026-03-24T11:54:20.782+0000 7f55abca1640 0 -- 192.168.123.105:0/545058583 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x5579ddc14170 msgr2=0x5579ddd52fe0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:54:21.702 INFO:tasks.workunit.client.0.vm05.stderr: Image deep copy: 6% complete... Image deep copy: 7% complete... Image deep copy: 9% complete... Image deep copy: 10% complete... Image deep copy: 12% complete... Image deep copy: 14% complete... Image deep copy: 15% complete... Image deep copy: 17% complete... Image deep copy: 18% complete... Image deep copy: 20% complete... Image deep copy: 21% complete... Image deep copy: 23% complete... Image deep copy: 25% complete... Image deep copy: 26% complete... Image deep copy: 28% complete... Image deep copy: 29% complete... Image deep copy: 31% complete... Image deep copy: 32% complete... Image deep copy: 34% complete... 
Image deep copy: 35% complete... Image deep copy: 37% complete... Image deep copy: 39% complete... Image deep copy: 40% complete... Image deep copy: 42% complete... Image deep copy: 43% complete... Image deep copy: 45% complete... Image deep copy: 46% complete... Image deep copy: 48% complete... Image deep copy: 50% complete... Image deep copy: 51% complete... Image deep copy: 53% complete... Image deep copy: 54% complete... Image deep copy: 56% complete... Image deep copy: 57% complete... Image deep copy: 59% complete... Image deep copy: 60% complete... Image deep copy: 62% complete... Image deep copy: 64% complete... Image deep copy: 65% complete... Image deep copy: 67% complete... Image deep copy: 68% complete... Image deep copy: 70% complete... Image deep copy: 71% complete... Image deep copy: 73% complete... Image deep copy: 75% complete... Image deep copy: 76% complete... Image deep copy: 78% complete... Image deep copy: 79% complete... Image deep copy: 81% complete... Image deep copy: 82% complete... Image deep copy: 84% complete... Image deep copy: 85% complete... Image deep copy: 87% complete... Image deep copy: 89% complete... Image deep copy: 90% complete... Image deep copy: 92% complete... Image deep copy: 93% complete... Image deep copy: 95% complete... Image deep copy: 96% complete... Image deep copy: 98% complete... Image deep copy: 100% complete... Image deep copy: 100% complete...done. 
2026-03-24T11:54:21.709 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info testimg3 2026-03-24T11:54:21.709 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'size 256 MiB' 2026-03-24T11:54:21.741 INFO:tasks.workunit.client.0.vm05.stdout: size 256 MiB in 64 objects 2026-03-24T11:54:21.741 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info testimg3 2026-03-24T11:54:21.741 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -v parent: 2026-03-24T11:54:21.771 INFO:tasks.workunit.client.0.vm05.stdout:rbd image 'testimg3': 2026-03-24T11:54:21.771 INFO:tasks.workunit.client.0.vm05.stdout: size 256 MiB in 64 objects 2026-03-24T11:54:21.771 INFO:tasks.workunit.client.0.vm05.stdout: order 22 (4 MiB objects) 2026-03-24T11:54:21.771 INFO:tasks.workunit.client.0.vm05.stdout: snapshot_count: 1 2026-03-24T11:54:21.771 INFO:tasks.workunit.client.0.vm05.stdout: id: 26b0699369c5 2026-03-24T11:54:21.771 INFO:tasks.workunit.client.0.vm05.stdout: block_name_prefix: rbd_data.26b0699369c5 2026-03-24T11:54:21.771 INFO:tasks.workunit.client.0.vm05.stdout: format: 2 2026-03-24T11:54:21.771 INFO:tasks.workunit.client.0.vm05.stdout: features: layering, exclusive-lock, object-map, fast-diff, deep-flatten 2026-03-24T11:54:21.771 INFO:tasks.workunit.client.0.vm05.stdout: op_features: 2026-03-24T11:54:21.771 INFO:tasks.workunit.client.0.vm05.stdout: flags: 2026-03-24T11:54:21.771 INFO:tasks.workunit.client.0.vm05.stdout: create_timestamp: Tue Mar 24 11:54:19 2026 2026-03-24T11:54:21.771 INFO:tasks.workunit.client.0.vm05.stdout: access_timestamp: Tue Mar 24 11:54:19 2026 2026-03-24T11:54:21.771 INFO:tasks.workunit.client.0.vm05.stdout: modify_timestamp: Tue Mar 24 11:54:19 2026 2026-03-24T11:54:21.771 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap ls testimg3 2026-03-24T11:54:21.771 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -v SNAPID 2026-03-24T11:54:21.771 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:54:21.771 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 1 
2026-03-24T11:54:21.801 INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-24T11:54:21.801 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap ls testimg3 2026-03-24T11:54:21.801 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '.*snap2.*' 2026-03-24T11:54:21.830 INFO:tasks.workunit.client.0.vm05.stdout: 42 snap2 256 MiB Tue Mar 24 11:54:20 2026 2026-03-24T11:54:21.831 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info testimg2 2026-03-24T11:54:21.831 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'features:.*deep-flatten' 2026-03-24T11:54:21.865 INFO:tasks.workunit.client.0.vm05.stdout: features: layering, exclusive-lock, object-map, fast-diff, deep-flatten 2026-03-24T11:54:21.865 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd flatten testimg2 2026-03-24T11:54:21.903 INFO:tasks.workunit.client.0.vm05.stderr: Image flatten: 1% complete... Image flatten: 3% complete... Image flatten: 4% complete... Image flatten: 6% complete... Image flatten: 7% complete... Image flatten: 9% complete... Image flatten: 10% complete... Image flatten: 12% complete... Image flatten: 14% complete... Image flatten: 15% complete... Image flatten: 17% complete... Image flatten: 18% complete... Image flatten: 20% complete... Image flatten: 21% complete... Image flatten: 23% complete... Image flatten: 25% complete... Image flatten: 26% complete... Image flatten: 28% complete... Image flatten: 29% complete... Image flatten: 31% complete... Image flatten: 32% complete... Image flatten: 34% complete... Image flatten: 35% complete... Image flatten: 37% complete... Image flatten: 39% complete... Image flatten: 40% complete... Image flatten: 42% complete... Image flatten: 43% complete... Image flatten: 45% complete... Image flatten: 46% complete... Image flatten: 48% complete... Image flatten: 50% complete... Image flatten: 51% complete... Image flatten: 53% complete... Image flatten: 54% complete... Image flatten: 56% complete... Image flatten: 57% complete... Image flatten: 59% complete... 
Image flatten: 60% complete... Image flatten: 62% complete... Image flatten: 64% complete... Image flatten: 65% complete... Image flatten: 67% complete... Image flatten: 68% complete... Image flatten: 70% complete... Image flatten: 71% complete... Image flatten: 73% complete... Image flatten: 75% complete... Image flatten: 76% complete... Image flatten: 78% complete... Image flatten: 79% complete... Image flatten: 81% complete... Image flatten: 82% complete... Image flatten: 84% complete... Image flatten: 85% complete... Image flatten: 87% complete... Image flatten: 89% complete... Image flatten: 90% complete... Image flatten: 92% complete... Image flatten: 93% complete... Image flatten: 95% complete... Image flatten: 96% complete... Image flatten: 98% complete... Image flatten: 100% complete...done. 2026-03-24T11:54:21.911 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap unprotect testimg1@snap1 2026-03-24T11:54:21.946 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images 2026-03-24T11:54:21.946 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:22.934 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:23.722 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:25.061 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:25.130 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:25.199 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:25.265 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:25.337 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:25.406 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:25.474 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:25.540 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:25.609 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in 
$IMGS 2026-03-24T11:54:25.684 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:25.752 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:25.821 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:25.896 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:25.975 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:26.051 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:26.124 INFO:tasks.workunit.client.0.vm05.stdout:testing clone v2... 2026-03-24T11:54:26.124 INFO:tasks.workunit.client.0.vm05.stderr:+ test_clone_v2 2026-03-24T11:54:26.125 INFO:tasks.workunit.client.0.vm05.stderr:+ echo 'testing clone v2...' 2026-03-24T11:54:26.125 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images 2026-03-24T11:54:26.125 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:26.199 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:26.277 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:26.355 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:26.438 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:26.518 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:26.587 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:26.657 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:26.725 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:26.800 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:26.870 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:26.939 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:27.007 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:27.074 
INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:27.145 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:27.215 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:27.285 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:27.355 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T11:54:27.424 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 -s 1 test1 2026-03-24T11:54:27.465 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create test1@1 2026-03-24T11:54:27.643 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-24T11:54:27.651 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone --rbd-default-clone-format=1 test1@1 test2 2026-03-24T11:54:27.677 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:54:27.678+0000 7fe34b7fe640 -1 librbd::image::CloneRequest: 0x5645dc1d7b50 validate_parent: parent snapshot must be protected 2026-03-24T11:54:27.677 INFO:tasks.workunit.client.0.vm05.stderr:rbd: clone error: (22) Invalid argument 2026-03-24T11:54:27.680 INFO:tasks.workunit.client.0.vm05.stderr:+ true 2026-03-24T11:54:27.680 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone --rbd-default-clone-format=2 test1@1 test2 2026-03-24T11:54:27.737 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd snap ls test1 --format json 2026-03-24T11:54:27.737 INFO:tasks.workunit.client.0.vm05.stderr:++ jq '.[] | select(.name == "1") | .id' 2026-03-24T11:54:27.767 INFO:tasks.workunit.client.0.vm05.stderr:+ SNAP_ID=43 2026-03-24T11:54:27.767 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone --rbd-default-clone-format=2 --snap-id 43 test1 test3 2026-03-24T11:54:27.824 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap protect test1@1 2026-03-24T11:54:27.862 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone --rbd-default-clone-format=1 test1@1 test4 2026-03-24T11:54:27.923 
INFO:tasks.workunit.client.0.vm05.stderr:+ rbd children test1@1 2026-03-24T11:54:27.923 INFO:tasks.workunit.client.0.vm05.stderr:+ sort 2026-03-24T11:54:27.923 INFO:tasks.workunit.client.0.vm05.stderr:+ tr '\n' ' ' 2026-03-24T11:54:27.923 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -E 'test2.*test3.*test4' 2026-03-24T11:54:27.959 INFO:tasks.workunit.client.0.vm05.stdout:rbd/test2 rbd/test3 rbd/test4 2026-03-24T11:54:27.959 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd children --descendants test1 2026-03-24T11:54:27.959 INFO:tasks.workunit.client.0.vm05.stderr:+ sort 2026-03-24T11:54:27.959 INFO:tasks.workunit.client.0.vm05.stderr:+ tr '\n' ' ' 2026-03-24T11:54:27.959 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -E 'test2.*test3.*test4' 2026-03-24T11:54:28.006 INFO:tasks.workunit.client.0.vm05.stdout:rbd/test2 rbd/test3 rbd/test4 2026-03-24T11:54:28.006 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd remove test4 2026-03-24T11:54:28.086 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:54:28.090+0000 7f6185d1b640 0 -- 192.168.123.105:0/2033790408 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x5647bee95150 msgr2=0x5647bee783c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-24T11:54:28.092 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T11:54:28.096 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap unprotect test1@1 2026-03-24T11:54:28.179 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap remove test1@1 2026-03-24T11:54:28.213 INFO:tasks.workunit.client.0.vm05.stderr: Removing snap: 100% complete...done. 
2026-03-24T11:54:28.223 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap list --all test1 2026-03-24T11:54:28.223 INFO:tasks.workunit.client.0.vm05.stderr:+ grep -E 'trash \(user 1\) *$' 2026-03-24T11:54:28.254 INFO:tasks.workunit.client.0.vm05.stdout: 43 302cbd8e-6004-4e0f-ae5b-199e7a16481a 1 MiB Tue Mar 24 11:54:27 2026 trash (user 1) 2026-03-24T11:54:28.254 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create test1@2 2026-03-24T11:54:28.647 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-24T11:54:28.655 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test1 2026-03-24T11:54:28.656 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'image has snapshots' 2026-03-24T11:54:28.699 INFO:tasks.workunit.client.0.vm05.stdout:rbd: image has snapshots - these must be deleted with 'rbd snap purge' before the image can be removed. 2026-03-24T11:54:28.699 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap rm test1@2 2026-03-24T11:54:29.610 INFO:tasks.workunit.client.0.vm05.stderr: Removing snap: 100% complete...done. 2026-03-24T11:54:29.621 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test1 2026-03-24T11:54:29.621 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'linked clones' 2026-03-24T11:54:29.667 INFO:tasks.workunit.client.0.vm05.stdout:rbd: image has snapshots with linked clones - these must be deleted or flattened before the image can be removed. 2026-03-24T11:54:29.667 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test3 2026-03-24T11:54:29.743 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:54:29.746+0000 7fd7d3fff640 0 -- 192.168.123.105:0/1605852449 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7fd7b0006e90 msgr2=0x7fd7b000b3a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-24T11:54:29.952 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 
2026-03-24T11:54:29.956 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test1 2026-03-24T11:54:29.956 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'linked clones' 2026-03-24T11:54:30.006 INFO:tasks.workunit.client.0.vm05.stdout:rbd: image has snapshots with linked clones - these must be deleted or flattened before the image can be removed. 2026-03-24T11:54:30.006 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd flatten test2 2026-03-24T11:54:30.668 INFO:tasks.workunit.client.0.vm05.stderr: Image flatten: 100% complete...done. 2026-03-24T11:54:30.679 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap list --all test1 2026-03-24T11:54:30.679 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:54:30.680 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^0$' 2026-03-24T11:54:30.712 INFO:tasks.workunit.client.0.vm05.stdout:0 2026-03-24T11:54:30.712 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test1 2026-03-24T11:54:30.797 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:54:30.798+0000 7f556bfff640 0 -- 192.168.123.105:0/3086991352 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7f55480053b0 msgr2=0x7f5548005820 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:54:31.006 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T11:54:31.011 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test2 2026-03-24T11:54:31.103 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:54:31.106+0000 7f061a988640 0 -- 192.168.123.105:0/1444625793 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x561dfdfb7320 msgr2=0x561dfdfec8a0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:54:31.109 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 
2026-03-24T11:54:31.113 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 -s 1 test1 2026-03-24T11:54:31.155 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create test1@1 2026-03-24T11:54:31.668 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-24T11:54:31.677 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create test1@2 2026-03-24T11:54:32.670 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-24T11:54:32.679 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone test1@1 test2 --rbd-default-clone-format 2 2026-03-24T11:54:32.729 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone test1@2 test3 --rbd-default-clone-format 2 2026-03-24T11:54:32.777 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap rm test1@1 2026-03-24T11:54:32.811 INFO:tasks.workunit.client.0.vm05.stderr: Removing snap: 100% complete...done. 2026-03-24T11:54:32.820 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap rm test1@2 2026-03-24T11:54:32.855 INFO:tasks.workunit.client.0.vm05.stderr: Removing snap: 100% complete...done. 2026-03-24T11:54:32.866 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd rm test1 2026-03-24T11:54:32.866 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test1 2026-03-24T11:54:32.902 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:54:32.906+0000 7f72718f7200 -1 librbd::api::Image: remove: image has snapshots - not removing 2026-03-24T11:54:32.902 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 0% complete...failed. 2026-03-24T11:54:32.905 INFO:tasks.workunit.client.0.vm05.stderr:rbd: image has snapshots with linked clones - these must be deleted or flattened before the image can be removed. 
2026-03-24T11:54:32.909 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0 2026-03-24T11:54:32.909 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test1 --rbd-move-parent-to-trash-on-remove=true 2026-03-24T11:54:32.975 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T11:54:32.979 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls -a 2026-03-24T11:54:32.980 INFO:tasks.workunit.client.0.vm05.stderr:+ grep test1 2026-03-24T11:54:33.009 INFO:tasks.workunit.client.0.vm05.stdout:27e5b72e6be5 test1 2026-03-24T11:54:33.010 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test2 2026-03-24T11:54:33.114 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:54:33.118+0000 7f840773e640 0 -- 192.168.123.105:0/2099987992 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x563db9cc35b0 msgr2=0x563db9cb34b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:54:33.131 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:54:33.134+0000 7f840773e640 0 -- 192.168.123.105:0/2099987992 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7f83e405bec0 msgr2=0x7f83e407c2c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:54:33.701 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 
2026-03-24T11:54:33.705 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls -a 2026-03-24T11:54:33.705 INFO:tasks.workunit.client.0.vm05.stderr:+ grep test1 2026-03-24T11:54:33.732 INFO:tasks.workunit.client.0.vm05.stdout:27e5b72e6be5 test1 2026-03-24T11:54:33.733 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test3 2026-03-24T11:54:33.819 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:54:33.822+0000 7ff8944d3640 0 -- 192.168.123.105:0/2203862987 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55e799e37320 msgr2=0x55e799e6c390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:54:33.832 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:54:33.834+0000 7ff8944d3640 0 -- 192.168.123.105:0/2203862987 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7ff86c008d30 msgr2=0x7ff87409d1b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:54:34.706 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T11:54:34.710 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail grep test1 2026-03-24T11:54:34.710 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls -a 2026-03-24T11:54:34.710 INFO:tasks.workunit.client.0.vm05.stderr:+ grep test1 2026-03-24T11:54:34.733 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0 2026-03-24T11:54:34.733 INFO:tasks.workunit.client.0.vm05.stdout:testing thick provision... 2026-03-24T11:54:34.733 INFO:tasks.workunit.client.0.vm05.stderr:+ test_thick_provision 2026-03-24T11:54:34.733 INFO:tasks.workunit.client.0.vm05.stderr:+ echo 'testing thick provision...' 
2026-03-24T11:54:34.733 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images
2026-03-24T11:54:34.734 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:35.005 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:35.073 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:35.140 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:35.208 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:35.276 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:35.343 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:35.411 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:35.483 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:35.613 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:35.690 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:35.759 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:35.833 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:36.125 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:36.209 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:36.295 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:36.373 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:36.454 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:36.534 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --thick-provision -s 64M test1
2026-03-24T11:54:36.846 INFO:tasks.workunit.client.0.vm05.stderr: Thick provisioning: 6% complete... Thick provisioning: 12% complete... Thick provisioning: 18% complete... Thick provisioning: 25% complete... Thick provisioning: 31% complete... Thick provisioning: 37% complete... Thick provisioning: 43% complete... Thick provisioning: 50% complete... Thick provisioning: 56% complete... Thick provisioning: 62% complete... Thick provisioning: 68% complete... Thick provisioning: 75% complete... Thick provisioning: 81% complete... Thick provisioning: 87% complete... Thick provisioning: 93% complete... Thick provisioning: 100% complete... Thick provisioning: 100% complete...done.
2026-03-24T11:54:36.857 INFO:tasks.workunit.client.0.vm05.stderr:+ count=0
2026-03-24T11:54:36.857 INFO:tasks.workunit.client.0.vm05.stderr:+ ret=
2026-03-24T11:54:36.857 INFO:tasks.workunit.client.0.vm05.stderr:+ '[' 0 -lt 10 ']'
2026-03-24T11:54:36.857 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd du
2026-03-24T11:54:36.857 INFO:tasks.workunit.client.0.vm05.stderr:+ grep test1
2026-03-24T11:54:36.857 INFO:tasks.workunit.client.0.vm05.stderr:+ tr -s ' '
2026-03-24T11:54:36.857 INFO:tasks.workunit.client.0.vm05.stderr:+ cut -d ' ' -f 4-5
2026-03-24T11:54:36.859 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^64 MiB'
2026-03-24T11:54:36.890 INFO:tasks.workunit.client.0.vm05.stdout:64 MiB
2026-03-24T11:54:36.890 INFO:tasks.workunit.client.0.vm05.stderr:+ ret=0
2026-03-24T11:54:36.890 INFO:tasks.workunit.client.0.vm05.stderr:+ '[' 0 = 0 ']'
2026-03-24T11:54:36.890 INFO:tasks.workunit.client.0.vm05.stderr:+ break
2026-03-24T11:54:36.890 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd du
2026-03-24T11:54:36.916 INFO:tasks.workunit.client.0.vm05.stdout:NAME PROVISIONED USED
2026-03-24T11:54:36.916 INFO:tasks.workunit.client.0.vm05.stdout:test1 64 MiB 64 MiB
2026-03-24T11:54:36.919 INFO:tasks.workunit.client.0.vm05.stderr:+ '[' 0 '!=' 0 ']'
2026-03-24T11:54:36.919 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test1
2026-03-24T11:54:37.023 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 6% complete... Removing image: 12% complete... Removing image: 18% complete... Removing image: 25% complete... Removing image: 31% complete... Removing image: 37% complete... Removing image: 43% complete... Removing image: 50% complete... Removing image: 56% complete... Removing image: 62% complete... Removing image: 68% complete... Removing image: 75% complete... Removing image: 81% complete... Removing image: 87% complete... Removing image: 93% complete...2026-03-24T11:54:37.026+0000 7f3f4e0b3640 0 -- 192.168.123.105:0/1812743307 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7f3f28012f20 msgr2=0x7f3f28013390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:54:37.234 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:54:37.238+0000 7f3f4fb3d640 0 -- 192.168.123.105:0/1812743307 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x5646483b5320 msgr2=0x5646483ea390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:54:37.242 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:54:37.246 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls
2026-03-24T11:54:37.246 INFO:tasks.workunit.client.0.vm05.stderr:+ grep test1
2026-03-24T11:54:37.246 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:54:37.246 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^0$'
2026-03-24T11:54:37.270 INFO:tasks.workunit.client.0.vm05.stdout:0
2026-03-24T11:54:37.270 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --thick-provision -s 4G test1
2026-03-24T11:54:37.849 INFO:tasks.workunit.client.0.vm05.stderr: Thick provisioning: 1% complete... Thick provisioning: 2% complete... Thick provisioning: 3% complete... Thick provisioning: 4% complete... Thick provisioning: 5% complete...2026-03-24T11:54:37.850+0000 7fb1ea515640 0 -- 192.168.123.105:0/2960598490 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x560118fd5a70 msgr2=0x560118f1bc10 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T11:54:38.015 INFO:tasks.workunit.client.0.vm05.stderr: Thick provisioning: 6% complete...2026-03-24T11:54:38.018+0000 7fb1e928c640 0 -- 192.168.123.105:0/2960598490 >> [v2:192.168.123.105:6808/3270659984,v1:192.168.123.105:6809/3270659984] conn(0x560119067350 msgr2=0x5601190877d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:54:57.234 INFO:tasks.workunit.client.0.vm05.stderr: Thick provisioning: 7% complete... Thick provisioning: 8% complete... Thick provisioning: 9% complete... Thick provisioning: 10% complete... Thick provisioning: 11% complete... Thick provisioning: 12% complete... Thick provisioning: 13% complete... Thick provisioning: 14% complete... Thick provisioning: 15% complete... Thick provisioning: 16% complete... Thick provisioning: 17% complete... Thick provisioning: 18% complete... Thick provisioning: 19% complete... Thick provisioning: 20% complete... Thick provisioning: 21% complete... Thick provisioning: 22% complete... Thick provisioning: 23% complete... Thick provisioning: 24% complete... Thick provisioning: 25% complete... Thick provisioning: 26% complete... Thick provisioning: 27% complete... Thick provisioning: 28% complete... Thick provisioning: 29% complete... Thick provisioning: 30% complete... Thick provisioning: 31% complete... Thick provisioning: 32% complete... Thick provisioning: 33% complete... Thick provisioning: 34% complete... Thick provisioning: 35% complete... Thick provisioning: 36% complete... Thick provisioning: 37% complete... Thick provisioning: 38% complete... Thick provisioning: 39% complete... Thick provisioning: 40% complete... Thick provisioning: 41% complete... Thick provisioning: 42% complete... Thick provisioning: 43% complete... Thick provisioning: 44% complete... Thick provisioning: 45% complete... Thick provisioning: 46% complete... Thick provisioning: 47% complete... Thick provisioning: 48% complete... Thick provisioning: 49% complete... Thick provisioning: 50% complete... Thick provisioning: 51% complete... Thick provisioning: 52% complete... Thick provisioning: 53% complete... Thick provisioning: 54% complete... Thick provisioning: 55% complete... Thick provisioning: 56% complete... Thick provisioning: 57% complete... Thick provisioning: 58% complete... Thick provisioning: 59% complete... Thick provisioning: 60% complete... Thick provisioning: 61% complete... Thick provisioning: 62% complete... Thick provisioning: 63% complete... Thick provisioning: 64% complete... Thick provisioning: 65% complete... Thick provisioning: 66% complete... Thick provisioning: 67% complete... Thick provisioning: 68% complete... Thick provisioning: 69% complete... Thick provisioning: 70% complete... Thick provisioning: 71% complete... Thick provisioning: 72% complete... Thick provisioning: 73% complete... Thick provisioning: 74% complete... Thick provisioning: 75% complete... Thick provisioning: 76% complete... Thick provisioning: 77% complete... Thick provisioning: 78% complete... Thick provisioning: 79% complete... Thick provisioning: 80% complete... Thick provisioning: 81% complete... Thick provisioning: 82% complete... Thick provisioning: 83% complete... Thick provisioning: 84% complete... Thick provisioning: 85% complete... Thick provisioning: 86% complete... Thick provisioning: 87% complete... Thick provisioning: 88% complete... Thick provisioning: 89% complete... Thick provisioning: 90% complete... Thick provisioning: 91% complete... Thick provisioning: 92% complete... Thick provisioning: 93% complete... Thick provisioning: 94% complete... Thick provisioning: 95% complete... Thick provisioning: 96% complete... Thick provisioning: 97% complete... Thick provisioning: 98% complete... Thick provisioning: 99% complete... Thick provisioning: 100% complete... Thick provisioning: 100% complete...done.
2026-03-24T11:54:57.245 INFO:tasks.workunit.client.0.vm05.stderr:+ count=0
2026-03-24T11:54:57.245 INFO:tasks.workunit.client.0.vm05.stderr:+ ret=
2026-03-24T11:54:57.245 INFO:tasks.workunit.client.0.vm05.stderr:+ '[' 0 -lt 10 ']'
2026-03-24T11:54:57.245 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd du
2026-03-24T11:54:57.245 INFO:tasks.workunit.client.0.vm05.stderr:+ grep test1
2026-03-24T11:54:57.245 INFO:tasks.workunit.client.0.vm05.stderr:+ cut -d ' ' -f 4-5
2026-03-24T11:54:57.245 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^4 GiB'
2026-03-24T11:54:57.246 INFO:tasks.workunit.client.0.vm05.stderr:+ tr -s ' '
2026-03-24T11:54:57.279 INFO:tasks.workunit.client.0.vm05.stdout:4 GiB
2026-03-24T11:54:57.280 INFO:tasks.workunit.client.0.vm05.stderr:+ ret=0
2026-03-24T11:54:57.280 INFO:tasks.workunit.client.0.vm05.stderr:+ '[' 0 = 0 ']'
2026-03-24T11:54:57.280 INFO:tasks.workunit.client.0.vm05.stderr:+ break
2026-03-24T11:54:57.280 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd du
2026-03-24T11:54:57.305 INFO:tasks.workunit.client.0.vm05.stdout:NAME PROVISIONED USED
2026-03-24T11:54:57.305 INFO:tasks.workunit.client.0.vm05.stdout:test1 4 GiB 4 GiB
2026-03-24T11:54:57.308 INFO:tasks.workunit.client.0.vm05.stderr:+ '[' 0 '!=' 0 ']'
2026-03-24T11:54:57.308 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm test1
2026-03-24T11:54:57.375 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:54:57.378+0000 7f301d69d640 0 -- 192.168.123.105:0/2871870522 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7f2ffc05bf40 msgr2=0x7f2ffc07c340 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T11:54:57.434 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 1% complete... Removing image: 2% complete...2026-03-24T11:54:57.438+0000 7f301f127640 0 -- 192.168.123.105:0/2871870522 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x556cb678c320 msgr2=0x556cb67c1390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:54:59.813 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 3% complete... Removing image: 4% complete... Removing image: 5% complete... Removing image: 6% complete... Removing image: 7% complete... Removing image: 8% complete... Removing image: 9% complete... Removing image: 10% complete... Removing image: 11% complete... Removing image: 12% complete... Removing image: 13% complete... Removing image: 14% complete... Removing image: 15% complete... Removing image: 16% complete... Removing image: 17% complete... Removing image: 18% complete... Removing image: 19% complete... Removing image: 20% complete... Removing image: 21% complete... Removing image: 22% complete... Removing image: 23% complete... Removing image: 24% complete... Removing image: 25% complete... Removing image: 26% complete... Removing image: 27% complete... Removing image: 28% complete... Removing image: 29% complete... Removing image: 30% complete... Removing image: 31% complete... Removing image: 32% complete... Removing image: 33% complete... Removing image: 34% complete... Removing image: 35% complete... Removing image: 36% complete... Removing image: 37% complete... Removing image: 38% complete... Removing image: 39% complete... Removing image: 40% complete... Removing image: 41% complete... Removing image: 42% complete... Removing image: 43% complete... Removing image: 44% complete... Removing image: 45% complete... Removing image: 46% complete... Removing image: 47% complete... Removing image: 48% complete... Removing image: 49% complete... Removing image: 50% complete... Removing image: 51% complete... Removing image: 52% complete... Removing image: 53% complete... Removing image: 54% complete... Removing image: 55% complete... Removing image: 56% complete... Removing image: 57% complete... Removing image: 58% complete... Removing image: 59% complete... Removing image: 60% complete... Removing image: 61% complete... Removing image: 62% complete... Removing image: 63% complete... Removing image: 64% complete... Removing image: 65% complete... Removing image: 66% complete... Removing image: 67% complete... Removing image: 68% complete... Removing image: 69% complete... Removing image: 70% complete... Removing image: 71% complete... Removing image: 72% complete... Removing image: 73% complete... Removing image: 74% complete... Removing image: 75% complete... Removing image: 76% complete... Removing image: 77% complete... Removing image: 78% complete... Removing image: 79% complete... Removing image: 80% complete... Removing image: 81% complete... Removing image: 82% complete... Removing image: 83% complete... Removing image: 84% complete... Removing image: 85% complete... Removing image: 86% complete... Removing image: 87% complete... Removing image: 88% complete... Removing image: 89% complete... Removing image: 90% complete... Removing image: 91% complete... Removing image: 92% complete... Removing image: 93% complete... Removing image: 94% complete... Removing image: 95% complete... Removing image: 96% complete... Removing image: 97% complete... Removing image: 98% complete... Removing image: 99% complete... Removing image: 100% complete...done.
2026-03-24T11:54:59.817 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd ls
2026-03-24T11:54:59.817 INFO:tasks.workunit.client.0.vm05.stderr:+ grep test1
2026-03-24T11:54:59.817 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:54:59.817 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^0$'
2026-03-24T11:54:59.842 INFO:tasks.workunit.client.0.vm05.stdout:0
2026-03-24T11:54:59.843 INFO:tasks.workunit.client.0.vm05.stdout:testing namespace...
2026-03-24T11:54:59.843 INFO:tasks.workunit.client.0.vm05.stderr:+ test_namespace
2026-03-24T11:54:59.843 INFO:tasks.workunit.client.0.vm05.stderr:+ echo 'testing namespace...'
2026-03-24T11:54:59.843 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images
2026-03-24T11:54:59.843 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:59.919 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:54:59.990 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:00.069 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:00.145 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:00.215 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:00.287 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:00.361 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:00.431 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:00.505 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:00.579 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:00.655 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:00.728 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:00.802 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:00.877 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:00.950 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:01.023 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:01.097 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:01.170 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd namespace ls
2026-03-24T11:55:01.170 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:55:01.170 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^0$'
2026-03-24T11:55:01.195 INFO:tasks.workunit.client.0.vm05.stdout:0
2026-03-24T11:55:01.196 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd namespace create rbd/test1
2026-03-24T11:55:01.228 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd namespace create --pool rbd --namespace test2
2026-03-24T11:55:01.261 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd namespace create --namespace test3
2026-03-24T11:55:01.292 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd namespace create rbd/test3
2026-03-24T11:55:01.293 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd namespace create rbd/test3
2026-03-24T11:55:01.316 INFO:tasks.workunit.client.0.vm05.stderr:rbd: failed to created namespace: 2026-03-24T11:55:01.318+0000 7f828f39e200 -1 librbd::api::Namespace: create: failed to add namespace: (17) File exists
2026-03-24T11:55:01.316 INFO:tasks.workunit.client.0.vm05.stderr:(17) File exists
2026-03-24T11:55:01.320 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:55:01.320 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd namespace list
2026-03-24T11:55:01.320 INFO:tasks.workunit.client.0.vm05.stderr:+ grep test
2026-03-24T11:55:01.320 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:55:01.320 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^3$'
2026-03-24T11:55:01.343 INFO:tasks.workunit.client.0.vm05.stdout:3
2026-03-24T11:55:01.343 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd namespace remove --pool rbd missing
2026-03-24T11:55:01.343 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd namespace remove --pool rbd missing
2026-03-24T11:55:01.358 INFO:tasks.workunit.client.0.vm05.stderr:rbd: namespace name was not specified
2026-03-24T11:55:01.360 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:55:01.360 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 1G rbd/test1/image1
2026-03-24T11:55:01.397 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 32M --io-size 4K rbd/test1/image1
2026-03-24T11:55:01.483 INFO:tasks.workunit.client.0.vm05.stdout:bench type write io_size 4096 io_threads 16 bytes 33554432 pattern random
2026-03-24T11:55:01.633 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:55:01.638+0000 7fb92e8b5640 0 -- 192.168.123.105:0/3800123905 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7fb90c0088e0 msgr2=0x7fb90c009b40 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:55:02.220 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:55:02.222+0000 7fb93033f640 0 -- 192.168.123.105:0/3800123905 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x555a89b495e0 msgr2=0x555a89b39f90 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:55:02.224 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:55:02.226+0000 7fb93033f640 0 -- 192.168.123.105:0/3800123905 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7fb91005bf40 msgr2=0x7fb91007c340 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T11:55:02.503 INFO:tasks.workunit.client.0.vm05.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-24T11:55:02.503 INFO:tasks.workunit.client.0.vm05.stdout: 1 2832 2803.14 11 MiB/s
2026-03-24T11:55:03.629 INFO:tasks.workunit.client.0.vm05.stdout: 2 4832 2261.18 8.8 MiB/s
2026-03-24T11:55:04.297 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:55:04.298+0000 7fb93033f640 0 -- 192.168.123.105:0/3800123905 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7fb91005bf40 msgr2=0x7fb91007c340 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:55:04.481 INFO:tasks.workunit.client.0.vm05.stdout: 3 6144 2056.07 8.0 MiB/s
2026-03-24T11:55:05.360 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:55:05.362+0000 7fb93033f640 0 -- 192.168.123.105:0/3800123905 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x555a89b495e0 msgr2=0x7fb910093860 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:55:05.491 INFO:tasks.workunit.client.0.vm05.stdout: 4 6368 1594.4 6.2 MiB/s
2026-03-24T11:55:06.107 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:55:06.110+0000 7fb93033f640 0 -- 192.168.123.105:0/3800123905 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7fb91005bf40 msgr2=0x7fb91007c340 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:55:06.725 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:55:06.730+0000 7fb93033f640 0 -- 192.168.123.105:0/3800123905 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7fb908008d30 msgr2=0x7fb910082460 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T11:55:06.814 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:55:06.814+0000 7fb93033f640 0 -- 192.168.123.105:0/3800123905 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7fb91005bf40 msgr2=0x7fb91007c320 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:55:07.040 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:55:07.042+0000 7fb93033f640 0 -- 192.168.123.105:0/3800123905 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7fb91005bf40 msgr2=0x7fb91007c320 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:55:07.628 INFO:tasks.workunit.client.0.vm05.stdout:elapsed: 6 ops: 8192 ops/sec: 1333.33 bytes/sec: 5.2 MiB/s
2026-03-24T11:55:07.638 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create rbd/test1/image1@1
2026-03-24T11:55:07.719 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T11:55:07.728 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone --rbd-default-clone-format 2 rbd/test1/image1@1 rbd/test2/image1
2026-03-24T11:55:07.782 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap rm rbd/test1/image1@1
2026-03-24T11:55:07.817 INFO:tasks.workunit.client.0.vm05.stderr: Removing snap: 100% complete...done.
2026-03-24T11:55:07.824 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd export rbd/test1/image1 -
2026-03-24T11:55:07.825 INFO:tasks.workunit.client.0.vm05.stderr:+ cmp /dev/fd/63 /dev/fd/62
2026-03-24T11:55:07.825 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd export rbd/test2/image1 -
2026-03-24T11:55:08.013 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 1% complete... Exporting image: 2% complete... Exporting image: 3% complete... Exporting image: 1% complete... Exporting image: 2% complete... Exporting image: 3% complete... Exporting image: 4% complete... Exporting image: 5% complete... Exporting image: 6% complete... Exporting image: 7% complete... Exporting image: 4% complete... Exporting image: 5% complete... Exporting image: 6% complete... Exporting image: 7% complete... Exporting image: 8% complete... Exporting image: 9% complete... Exporting image: 10% complete... Exporting image: 11% complete... Exporting image: 8% complete... Exporting image: 9% complete... Exporting image: 10% complete... Exporting image: 11% complete... Exporting image: 12% complete... Exporting image: 13% complete... Exporting image: 12% complete... Exporting image: 13% complete... Exporting image: 14% complete... Exporting image: 15% complete...2026-03-24T11:55:08.018+0000 7f85b2da9640 0 -- 192.168.123.105:0/3779544866 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x564324e06ca0 msgr2=0x564324e3e3e0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:55:08.071 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 14% complete... Exporting image: 15% complete... Exporting image: 16% complete... Exporting image: 17% complete... Exporting image: 16% complete... Exporting image: 17% complete... Exporting image: 18% complete... Exporting image: 19% complete...2026-03-24T11:55:08.074+0000 7f85b2da9640 0 -- 192.168.123.105:0/3779544866 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7f859005bdd0 msgr2=0x7f859007c1d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:55:08.246 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 18% complete... Exporting image: 19% complete... Exporting image: 20% complete... Exporting image: 21% complete... Exporting image: 20% complete... Exporting image: 21% complete... Exporting image: 22% complete... Exporting image: 23% complete... Exporting image: 22% complete... Exporting image: 23% complete... Exporting image: 24% complete... Exporting image: 25% complete... Exporting image: 24% complete... Exporting image: 25% complete... Exporting image: 26% complete... Exporting image: 26% complete... Exporting image: 27% complete... Exporting image: 28% complete... Exporting image: 29% complete... Exporting image: 27% complete... Exporting image: 28% complete... Exporting image: 29% complete... Exporting image: 30% complete... Exporting image: 30% complete...2026-03-24T11:55:08.250+0000 7f0f6c3d8640 0 -- 192.168.123.105:0/3186379728 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x557ed9ca4c60 msgr2=0x557ed9cda830 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T11:55:08.302 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 31% complete... Exporting image: 31% complete... Exporting image: 32% complete... Exporting image: 33% complete... Exporting image: 34% complete... Exporting image: 32% complete... Exporting image: 33% complete... Exporting image: 34% complete... Exporting image: 35% complete... Exporting image: 36% complete... Exporting image: 37% complete... Exporting image: 38% complete...2026-03-24T11:55:08.306+0000 7f0f6c3d8640 0 -- 192.168.123.105:0/3186379728 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7f0f4c05bf20 msgr2=0x7f0f4c07c320 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:55:09.167 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 35% complete... Exporting image: 36% complete... Exporting image: 37% complete... Exporting image: 38% complete... Exporting image: 39% complete... Exporting image: 40% complete... Exporting image: 41% complete... Exporting image: 42% complete... Exporting image: 39% complete... Exporting image: 40% complete... Exporting image: 41% complete... Exporting image: 42% complete... Exporting image: 43% complete... Exporting image: 44% complete... Exporting image: 45% complete... Exporting image: 46% complete... Exporting image: 43% complete... Exporting image: 44% complete... Exporting image: 45% complete... Exporting image: 46% complete... Exporting image: 47% complete... Exporting image: 48% complete... Exporting image: 49% complete... Exporting image: 50% complete... Exporting image: 47% complete... Exporting image: 48% complete... Exporting image: 49% complete... Exporting image: 50% complete... Exporting image: 51% complete... Exporting image: 52% complete... Exporting image: 53% complete... Exporting image: 51% complete... Exporting image: 52% complete... Exporting image: 53% complete... Exporting image: 54% complete... Exporting image: 54% complete... Exporting image: 55% complete... Exporting image: 56% complete... Exporting image: 57% complete... Exporting image: 55% complete... Exporting image: 56% complete... Exporting image: 57% complete... Exporting image: 58% complete... Exporting image: 58% complete... Exporting image: 59% complete... Exporting image: 60% complete... Exporting image: 61% complete... Exporting image: 59% complete... Exporting image: 60% complete... Exporting image: 61% complete... Exporting image: 62% complete... Exporting image: 62% complete... Exporting image: 63% complete... Exporting image: 64% complete... Exporting image: 65% complete... Exporting image: 63% complete... Exporting image: 64% complete... Exporting image: 65% complete... Exporting image: 66% complete... Exporting image: 67% complete... Exporting image: 66% complete... Exporting image: 67% complete... Exporting image: 68% complete... Exporting image: 69% complete... Exporting image: 68% complete... Exporting image: 69% complete... Exporting image: 70% complete... Exporting image: 71% complete... Exporting image: 72% complete... Exporting image: 73% complete... Exporting image: 70% complete... Exporting image: 71% complete... Exporting image: 72% complete... Exporting image: 73% complete... Exporting image: 74% complete... Exporting image: 74% complete... Exporting image: 75% complete... Exporting image: 76% complete... Exporting image: 77% complete... Exporting image: 75% complete... Exporting image: 76% complete... Exporting image: 77% complete... Exporting image: 78% complete... Exporting image: 78% complete... Exporting image: 79% complete... Exporting image: 80% complete... Exporting image: 81% complete... Exporting image: 79% complete... Exporting image: 80% complete... Exporting image: 81% complete... Exporting image: 82% complete... Exporting image: 82% complete... Exporting image: 83% complete... Exporting image: 84% complete... Exporting image: 85% complete... Exporting image: 83% complete... Exporting image: 84% complete... Exporting image: 85% complete... Exporting image: 86% complete... Exporting image: 86% complete... Exporting image: 87% complete... Exporting image: 88% complete... Exporting image: 89% complete... Exporting image: 87% complete... Exporting image: 88% complete... Exporting image: 89% complete... Exporting image: 90% complete... Exporting image: 90% complete... Exporting image: 91% complete... Exporting image: 92% complete... Exporting image: 91% complete... Exporting image: 92% complete... Exporting image: 93% complete... Exporting image: 94% complete... Exporting image: 93% complete... Exporting image: 94% complete... Exporting image: 95% complete... Exporting image: 96% complete... Exporting image: 95% complete... Exporting image: 96% complete... Exporting image: 97% complete... Exporting image: 98% complete... Exporting image: 97% complete... Exporting image: 98% complete... Exporting image: 99% complete... Exporting image: 99% complete... Exporting image: 100% complete...done.
2026-03-24T11:55:09.168 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 100% complete...done.
2026-03-24T11:55:09.178 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm rbd/test2/image1
2026-03-24T11:55:09.380 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 1% complete... Removing image: 2% complete... Removing image: 3% complete... Removing image: 4% complete... Removing image: 5% complete... Removing image: 6% complete... Removing image: 7% complete... Removing image: 8% complete... Removing image: 9% complete... Removing image: 10% complete... Removing image: 11% complete... Removing image: 12% complete... Removing image: 13% complete... Removing image: 14% complete... Removing image: 15% complete... Removing image: 16% complete... Removing image: 17% complete... Removing image: 18% complete... Removing image: 19% complete... Removing image: 20% complete... Removing image: 21% complete... Removing image: 22% complete... Removing image: 23% complete... Removing image: 24% complete... Removing image: 25% complete... Removing image: 26% complete... Removing image: 27% complete... Removing image: 28% complete... Removing image: 29% complete... Removing image: 30% complete... Removing image: 31% complete... Removing image: 32% complete... Removing image: 33% complete... Removing image: 34% complete... Removing image: 35% complete... Removing image: 36% complete... Removing image: 37% complete... Removing image: 38% complete... Removing image: 39% complete... Removing image: 40% complete... Removing image: 41% complete... Removing image: 42% complete... Removing image: 43% complete... Removing image: 44% complete... Removing image: 45% complete... Removing image: 46% complete... Removing image: 47% complete... Removing image: 48% complete... Removing image: 49% complete... Removing image: 50% complete... Removing image: 51% complete... Removing image: 52% complete... Removing image: 53% complete... Removing image: 54% complete... Removing image: 55% complete... Removing image: 56% complete... Removing image: 57% complete... Removing image: 58% complete... Removing image: 59% complete... Removing image: 60% complete... Removing image: 61% complete... Removing image: 62% complete... Removing image: 63% complete... Removing image: 64% complete... Removing image: 65% complete... Removing image: 66% complete... Removing image: 67% complete... Removing image: 68% complete... Removing image: 69% complete... Removing image: 70% complete... Removing image: 71% complete... Removing image: 72% complete... Removing image: 73% complete... Removing image: 74% complete... Removing image: 75% complete... Removing image: 76% complete... Removing image: 77% complete... Removing image: 78% complete... Removing image: 79% complete... Removing image: 80% complete... Removing image: 81% complete... Removing image: 82% complete... Removing image: 83% complete... Removing image: 84% complete... Removing image: 85% complete... Removing image: 86% complete... Removing image: 87% complete... Removing image: 88% complete... Removing image: 89% complete... Removing image: 90% complete... Removing image: 91% complete... Removing image: 92% complete... Removing image: 93% complete... Removing image: 94% complete... Removing image: 95% complete... Removing image: 96% complete... Removing image: 97% complete... Removing image: 98% complete... Removing image: 99% complete...2026-03-24T11:55:09.382+0000 7f8b63911640 0 -- 192.168.123.105:0/1578755096 >> [v2:192.168.123.105:6808/3270659984,v1:192.168.123.105:6809/3270659984] conn(0x561cd4ae2b60 msgr2=0x561cd4b02fe0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:55:09.388 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:55:09.390+0000 7f8b63110640 0 -- 192.168.123.105:0/1578755096 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7f8b3c008d30 msgr2=0x7f8b3c0291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:55:10.463 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:55:10.467 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 1G rbd/image2
2026-03-24T11:55:10.506 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 32M --io-size 4K rbd/image2
2026-03-24T11:55:10.543 INFO:tasks.workunit.client.0.vm05.stdout:bench type write io_size 4096 io_threads 16 bytes 33554432 pattern random
2026-03-24T11:55:11.052 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:55:11.054+0000 7f9054c32640 0 -- 192.168.123.105:0/4114038865 >> [v2:192.168.123.105:6808/3270659984,v1:192.168.123.105:6809/3270659984] conn(0x7f90300049e0 msgr2=0x7f9030024dc0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:55:11.761 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:55:11.762+0000 7f904ffff640 0 -- 192.168.123.105:0/4114038865 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7f903002a770 msgr2=0x7f903002ab80 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:55:11.859 INFO:tasks.workunit.client.0.vm05.stdout: SEC OPS OPS/SEC BYTES/SEC
2026-03-24T11:55:11.859 INFO:tasks.workunit.client.0.vm05.stdout: 1 3744 2865.84 11 MiB/s
2026-03-24T11:55:11.917 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:55:11.918+0000 7f9055ebb640 0 -- 192.168.123.105:0/4114038865 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x5564d5c788a0 msgr2=0x5564d5d07d70 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:55:11.919 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:55:11.922+0000 7f9055ebb640 0 -- 192.168.123.105:0/4114038865 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7f903002a770 msgr2=0x7f903409d7b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T11:55:12.757 INFO:tasks.workunit.client.0.vm05.stdout: 2 3776 1717.38 6.7 MiB/s
2026-03-24T11:55:13.543 INFO:tasks.workunit.client.0.vm05.stdout: 3 4496 1506 5.9 MiB/s
2026-03-24T11:55:13.558 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:55:13.562+0000 7f9055ebb640 0 -- 192.168.123.105:0/4114038865 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7f90300049e0 msgr2=0x7f903409d270 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T11:55:13.782 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:55:13.786+0000 7f9054c32640 0 -- 192.168.123.105:0/4114038865 >> [v2:192.168.123.105:6808/3270659984,v1:192.168.123.105:6809/3270659984] conn(0x7f903405c090 msgr2=0x7f903407c490 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:55:14.378 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:55:14.382+0000 7f9055ebb640 0 -- 192.168.123.105:0/4114038865 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7f903405c090 msgr2=0x7f903407c470 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T11:55:14.423 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:55:14.426+0000 7f9055ebb640 0 -- 192.168.123.105:0/4114038865 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x5564d5c788a0 msgr2=0x7f903409dcf0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:55:14.545 INFO:tasks.workunit.client.0.vm05.stdout: 4 6816 1709.7 6.7 MiB/s
2026-03-24T11:55:15.223 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:55:15.226+0000 7f904ffff640 0 -- 192.168.123.105:0/4114038865 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7f903405c090 msgr2=0x7f903407c470 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:55:17.112 INFO:tasks.workunit.client.0.vm05.stdout: 6 7776 1187.08 4.6 MiB/s
2026-03-24T11:55:18.547
INFO:tasks.workunit.client.0.vm05.stdout:elapsed: 8 ops: 8192 ops/sec: 1023.48 bytes/sec: 4.0 MiB/s
2026-03-24T11:55:18.563 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create rbd/image2@1
2026-03-24T11:55:18.640 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T11:55:18.653 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone --rbd-default-clone-format 2 rbd/image2@1 rbd/test2/image2
2026-03-24T11:55:18.717 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap rm rbd/image2@1
2026-03-24T11:55:18.753 INFO:tasks.workunit.client.0.vm05.stderr: Removing snap: 100% complete...done.
2026-03-24T11:55:18.763 INFO:tasks.workunit.client.0.vm05.stderr:+ cmp /dev/fd/63 /dev/fd/62
2026-03-24T11:55:18.763 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd export rbd/image2 -
2026-03-24T11:55:18.763 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd export rbd/test2/image2 -
2026-03-24T11:55:18.999 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 1% complete... [interleaved progress from the two parallel exports, 1%-15%, elided] Exporting image: 15% complete...2026-03-24T11:55:19.002+0000 7f5d424d2640 0 -- 192.168.123.105:0/3870423261 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7f5d1c04f7b0 msgr2=0x7f5d1c06fb90 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:55:19.058 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 16% complete... [interleaved progress 16%-19% elided] Exporting image: 19% complete...2026-03-24T11:55:19.062+0000 7f5d424d2640 0 -- 192.168.123.105:0/3870423261 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7f5d2405c010 msgr2=0x7f5d2407c410 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:55:19.297 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 20% complete... [interleaved progress 20%-34% elided] Exporting image: 34% complete...2026-03-24T11:55:19.302+0000 7fac8e95b640 0 -- 192.168.123.105:0/2924137997 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7fac6c003b40 msgr2=0x7fac6c045a30 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T11:55:19.371 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 35% complete... [interleaved progress 35%-38% elided] Exporting image: 38% complete...2026-03-24T11:55:19.374+0000 7fac903e5640 0 -- 192.168.123.105:0/2924137997 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x560883c1ef30 msgr2=0x560883cb39b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:55:20.338 INFO:tasks.workunit.client.0.vm05.stderr: Exporting image: 39% complete... [interleaved progress 39%-100% elided] Exporting image: 100% complete...done. Exporting image: 100% complete...done.
2026-03-24T11:55:20.338 INFO:tasks.workunit.client.0.vm05.stderr:
2026-03-24T11:55:20.346 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd rm rbd/image2
2026-03-24T11:55:20.346 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm rbd/image2
2026-03-24T11:55:20.387 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:55:20.390+0000 7f82b5940200 -1 librbd::api::Image: remove: image has snapshots - not removing
2026-03-24T11:55:20.387 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 0% complete...failed.
2026-03-24T11:55:20.391 INFO:tasks.workunit.client.0.vm05.stderr:rbd: image has snapshots with linked clones - these must be deleted or flattened before the image can be removed.
2026-03-24T11:55:20.394 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:55:20.394 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm rbd/test2/image2
2026-03-24T11:55:20.479 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 1% complete... [progress 2%-98% elided] Removing image: 99% complete...2026-03-24T11:55:20.482+0000 7fc0ca432640 0 -- 192.168.123.105:0/4113274995 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7fc0a8005fb0 msgr2=0x7fc0a8026390 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:55:20.489 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:55:20.494+0000 7fc0ca432640 0 -- 192.168.123.105:0/4113274995 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x555a7a1072d0 msgr2=0x7fc0ac09d110 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:55:20.698 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:55:20.702 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm rbd/image2
2026-03-24T11:55:20.812 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 1% complete... [progress 2%-5% elided] Removing image: 6% complete...2026-03-24T11:55:20.814+0000 7fa9ba229640 0 -- 192.168.123.105:0/25697645 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x560f063a6360 msgr2=0x560f063d9fc0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:55:20.849 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 7% complete... [progress 8%-13% elided]
Removing image: 14% complete...2026-03-24T11:55:20.850+0000 7fa9ba229640 0 -- 192.168.123.105:0/25697645 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7fa99805bf70 msgr2=0x7fa99807c370 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:55:21.214 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 15% complete... [progress 16%-99% elided] Removing image: 100% complete...done.
2026-03-24T11:55:21.217 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 1G rbd/test1/image3
2026-03-24T11:55:21.259 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create rbd/test1/image3@1
2026-03-24T11:55:21.678 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done.
2026-03-24T11:55:21.687 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap protect rbd/test1/image3@1
2026-03-24T11:55:21.720 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone --rbd-default-clone-format 1 rbd/test1/image3@1 rbd/test1/image4
2026-03-24T11:55:21.775 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm rbd/test1/image4
2026-03-24T11:55:21.995 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:55:21.998+0000 7fbd66d9e640 0 --2- 192.168.123.105:0/4229996135 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x55d79612d2d0 0x55d796075e90 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
2026-03-24T11:55:22.064 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 1% complete... [progress 2%-98% elided] Removing image: 99% complete...2026-03-24T11:55:22.066+0000 7fbd68027640 0 -- 192.168.123.105:0/4229996135 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55d7961514e0 msgr2=0x55d796129c70 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T11:55:22.080 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:55:22.084 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap unprotect rbd/test1/image3@1
2026-03-24T11:55:22.123 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap rm rbd/test1/image3@1
2026-03-24T11:55:22.676 INFO:tasks.workunit.client.0.vm05.stderr: Removing snap: 100% complete...done.
2026-03-24T11:55:22.688 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm rbd/test1/image3
2026-03-24T11:55:22.775 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 1% complete... [progress 2%-35% elided]
Removing image: 36% complete... [progress 37%-97% elided] Removing image: 98% complete... Removing image: 99% complete...2026-03-24T11:55:22.778+0000 7f7451c69640 0 -- 192.168.123.105:0/2843789803 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7f7434008d30 msgr2=0x7f74340291b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T11:55:22.789 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T11:55:22.793 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 1G --namespace test1 image2
2026-03-24T11:55:22.832 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd namespace remove rbd/test1
2026-03-24T11:55:22.832 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd namespace remove rbd/test1
2026-03-24T11:55:22.864 INFO:tasks.workunit.client.0.vm05.stderr:rbd: namespace contains images which must be deleted first.
2026-03-24T11:55:22.868 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:55:22.868 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd group create rbd/test1/group1
2026-03-24T11:55:22.902 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd group image add rbd/test1/group1 rbd/test1/image1
2026-03-24T11:55:22.943 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd group image add --group-pool rbd --group-namespace test1 --group group1 --image-pool rbd --image-namespace test1 --image image2
2026-03-24T11:55:22.989 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd group image rm --group-pool rbd --group-namespace test1 --group group1 --image-pool rbd --image-namespace test1 --image image1
2026-03-24T11:55:23.234 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd group image rm rbd/test1/group1 rbd/test1/image2
2026-03-24T11:55:23.277 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd group rm rbd/test1/group1
2026-03-24T11:55:23.308 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash move rbd/test1/image1
2026-03-24T11:55:23.367 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd trash --namespace
test1 ls 2026-03-24T11:55:23.367 INFO:tasks.workunit.client.0.vm05.stderr:++ cut -d ' ' -f 1 2026-03-24T11:55:23.393 INFO:tasks.workunit.client.0.vm05.stderr:+ ID=291227fadf91 2026-03-24T11:55:23.394 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash rm rbd/test1/291227fadf91 2026-03-24T11:55:23.551 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 1% complete... Removing image: 2% complete... Removing image: 3% complete... Removing image: 4% complete... Removing image: 5% complete... Removing image: 6% complete... Removing image: 7% complete... Removing image: 8% complete... Removing image: 9% complete... Removing image: 10% complete... Removing image: 11% complete... Removing image: 12% complete... Removing image: 13% complete... Removing image: 14% complete... Removing image: 15% complete... Removing image: 16% complete... Removing image: 17% complete... Removing image: 18% complete... Removing image: 19% complete... Removing image: 20% complete... Removing image: 21% complete... Removing image: 22% complete... Removing image: 23% complete... Removing image: 24% complete... Removing image: 25% complete... Removing image: 26% complete... Removing image: 27% complete... Removing image: 28% complete... Removing image: 29% complete... Removing image: 30% complete... Removing image: 31% complete... Removing image: 32% complete...2026-03-24T11:55:23.554+0000 7f5212a9e640 0 -- 192.168.123.105:0/137710160 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x56427dac7380 msgr2=0x56427dba8f30 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-24T11:55:23.582 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 33% complete... Removing image: 34% complete... Removing image: 35% complete... Removing image: 36% complete... Removing image: 37% complete... Removing image: 38% complete... 
Removing image: 39% complete...2026-03-24T11:55:23.586+0000 7f5212a9e640 0 -- 192.168.123.105:0/137710160 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7f51f005bf70 msgr2=0x7f51f007c370 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T11:55:23.823 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 40% complete... Removing image: 41% complete... Removing image: 42% complete... Removing image: 43% complete... Removing image: 44% complete... Removing image: 45% complete... Removing image: 46% complete... Removing image: 47% complete... Removing image: 48% complete... Removing image: 49% complete... Removing image: 50% complete... Removing image: 51% complete... Removing image: 52% complete... Removing image: 53% complete... Removing image: 54% complete... Removing image: 55% complete... Removing image: 56% complete... Removing image: 57% complete... Removing image: 58% complete... Removing image: 59% complete... Removing image: 60% complete... Removing image: 61% complete... Removing image: 62% complete... Removing image: 63% complete... Removing image: 64% complete... Removing image: 65% complete... Removing image: 66% complete... Removing image: 67% complete... Removing image: 68% complete... Removing image: 69% complete... Removing image: 70% complete... Removing image: 71% complete... Removing image: 72% complete... Removing image: 73% complete... Removing image: 74% complete... Removing image: 75% complete... Removing image: 76% complete... Removing image: 77% complete... Removing image: 78% complete... Removing image: 79% complete... Removing image: 80% complete... Removing image: 81% complete... Removing image: 82% complete... Removing image: 83% complete... Removing image: 84% complete... Removing image: 85% complete... Removing image: 86% complete... Removing image: 87% complete... Removing image: 88% complete... Removing image: 89% complete... Removing image: 90% complete... 
Removing image: 91% complete... Removing image: 92% complete... Removing image: 93% complete... Removing image: 94% complete... Removing image: 95% complete... Removing image: 96% complete... Removing image: 97% complete... Removing image: 98% complete... Removing image: 99% complete... Removing image: 100% complete...done. 2026-03-24T11:55:23.826 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd remove rbd/test1/image2 2026-03-24T11:55:23.937 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 1% complete... Removing image: 2% complete... Removing image: 3% complete... Removing image: 4% complete... Removing image: 5% complete... Removing image: 6% complete... Removing image: 7% complete... Removing image: 8% complete... Removing image: 9% complete... Removing image: 10% complete... Removing image: 11% complete... Removing image: 12% complete... Removing image: 13% complete... Removing image: 14% complete... Removing image: 15% complete... Removing image: 16% complete... Removing image: 17% complete... Removing image: 18% complete... Removing image: 19% complete... Removing image: 20% complete... Removing image: 21% complete... Removing image: 22% complete... Removing image: 23% complete... Removing image: 24% complete... Removing image: 25% complete... Removing image: 26% complete... Removing image: 27% complete... Removing image: 28% complete... Removing image: 29% complete... Removing image: 30% complete... Removing image: 31% complete... Removing image: 32% complete... Removing image: 33% complete... Removing image: 34% complete... Removing image: 35% complete... Removing image: 36% complete... Removing image: 37% complete... Removing image: 38% complete... Removing image: 39% complete... Removing image: 40% complete... Removing image: 41% complete... Removing image: 42% complete... Removing image: 43% complete... Removing image: 44% complete... Removing image: 45% complete... Removing image: 46% complete... Removing image: 47% complete... 
Removing image: 48% complete... Removing image: 49% complete... Removing image: 50% complete... Removing image: 51% complete... Removing image: 52% complete... Removing image: 53% complete... Removing image: 54% complete... Removing image: 55% complete... Removing image: 56% complete... Removing image: 57% complete... Removing image: 58% complete... Removing image: 59% complete... Removing image: 60% complete... Removing image: 61% complete... Removing image: 62% complete... Removing image: 63% complete... Removing image: 64% complete... Removing image: 65% complete... Removing image: 66% complete... Removing image: 67% complete... Removing image: 68% complete... Removing image: 69% complete... Removing image: 70% complete... Removing image: 71% complete... Removing image: 72% complete... Removing image: 73% complete... Removing image: 74% complete... Removing image: 75% complete... Removing image: 76% complete... Removing image: 77% complete... Removing image: 78% complete... Removing image: 79% complete... Removing image: 80% complete... Removing image: 81% complete... Removing image: 82% complete... Removing image: 83% complete... Removing image: 84% complete... Removing image: 85% complete... Removing image: 86% complete... Removing image: 87% complete... Removing image: 88% complete... Removing image: 89% complete... Removing image: 90% complete... Removing image: 91% complete... Removing image: 92% complete... Removing image: 93% complete... Removing image: 94% complete... Removing image: 95% complete... Removing image: 96% complete... Removing image: 97% complete... Removing image: 98% complete... 
Removing image: 99% complete...2026-03-24T11:55:23.938+0000 7f2c02f12640 0 -- 192.168.123.105:0/3375459555 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x55608b2e82d0 msgr2=0x55608b22f750 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure 2026-03-24T11:55:23.951 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T11:55:23.955 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd namespace remove --pool rbd --namespace test1 2026-03-24T11:55:24.008 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd namespace remove --namespace test3 2026-03-24T11:55:24.065 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd namespace list 2026-03-24T11:55:24.065 INFO:tasks.workunit.client.0.vm05.stderr:+ grep test 2026-03-24T11:55:24.065 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l 2026-03-24T11:55:24.065 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^1$' 2026-03-24T11:55:24.089 INFO:tasks.workunit.client.0.vm05.stdout:1 2026-03-24T11:55:24.089 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd namespace remove rbd/test2 2026-03-24T11:55:24.162 INFO:tasks.workunit.client.0.vm05.stdout:testing trash purge schedule... 2026-03-24T11:55:24.163 INFO:tasks.workunit.client.0.vm05.stderr:+ test_trash_purge_schedule 2026-03-24T11:55:24.163 INFO:tasks.workunit.client.0.vm05.stderr:+ echo 'testing trash purge schedule...' 
2026-03-24T11:55:24.163 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images
2026-03-24T11:55:24.163 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:24.371 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:24.530 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:24.629 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:24.937 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:25.016 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:25.097 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:25.175 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:25.263 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:25.351 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:25.430 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:25.509 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:25.587 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:25.665 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:25.744 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:25.818 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:25.894 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:25.980 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:55:26.063 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd pool create rbd2 8
2026-03-24T11:55:27.016 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd2' already exists
2026-03-24T11:55:27.028 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd pool init rbd2
2026-03-24T11:55:29.665 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd namespace create rbd2/ns1
2026-03-24T11:55:29.693 INFO:tasks.workunit.client.0.vm05.stderr:++ ceph rbd trash purge schedule list
2026-03-24T11:55:29.968 INFO:tasks.workunit.client.0.vm05.stderr:+ test '{}' = '{}'
2026-03-24T11:55:29.969 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph rbd trash purge schedule status
2026-03-24T11:55:29.969 INFO:tasks.workunit.client.0.vm05.stderr:+ fgrep '"scheduled": []'
2026-03-24T11:55:30.247 INFO:tasks.workunit.client.0.vm05.stdout: "scheduled": []
2026-03-24T11:55:30.247 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd trash purge schedule ls
2026-03-24T11:55:30.247 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule ls
2026-03-24T11:55:30.272 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:55:30.272 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd trash purge schedule ls -R --format json
2026-03-24T11:55:30.299 INFO:tasks.workunit.client.0.vm05.stderr:+ test '[]' = '[]'
2026-03-24T11:55:30.299 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd trash purge schedule remove dummy
2026-03-24T11:55:30.300 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule remove dummy
2026-03-24T11:55:30.324 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T11:55:30.326+0000 7f1d10701640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy)
2026-03-24T11:55:30.325 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd trash purge schedule remove failed: (22) Invalid argument: Invalid interval (dummy)
2026-03-24T11:55:30.328 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:55:30.328 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd trash purge schedule remove 1d dummy
2026-03-24T11:55:30.328 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule remove 1d dummy
2026-03-24T11:55:30.352 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T11:55:30.354+0000 7f1d10701640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy
2026-03-24T11:55:30.352 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd trash purge schedule remove failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy
2026-03-24T11:55:30.356 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:55:30.356 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd trash purge schedule remove -p rbd dummy
2026-03-24T11:55:30.356 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule remove -p rbd dummy
2026-03-24T11:55:30.379 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T11:55:30.382+0000 7f1d10701640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy)
2026-03-24T11:55:30.379 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd trash purge schedule remove failed: (22) Invalid argument: Invalid interval (dummy)
2026-03-24T11:55:30.383 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:55:30.383 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd trash purge schedule remove -p rbd 1d dummy
2026-03-24T11:55:30.383 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule remove -p rbd 1d dummy
2026-03-24T11:55:30.406 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T11:55:30.410+0000 7f1d10701640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy
2026-03-24T11:55:30.406 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd trash purge schedule remove failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy
2026-03-24T11:55:30.410 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:55:30.410 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule add -p rbd 1d 01:30
2026-03-24T11:55:30.442 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule ls -p rbd
2026-03-24T11:55:30.442 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'every 1d starting at 01:30'
2026-03-24T11:55:30.467 INFO:tasks.workunit.client.0.vm05.stdout:every 1d starting at 01:30:00
2026-03-24T11:55:30.467 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd trash purge schedule ls
2026-03-24T11:55:30.467 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule ls
2026-03-24T11:55:30.693 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:55:30.693 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule ls -R
2026-03-24T11:55:30.693 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'every 1d starting at 01:30'
2026-03-24T11:55:30.718 INFO:tasks.workunit.client.0.vm05.stdout:rbd - every 1d starting at 01:30:00
2026-03-24T11:55:30.719 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule ls -R -p rbd
2026-03-24T11:55:30.719 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'every 1d starting at 01:30'
2026-03-24T11:55:30.745 INFO:tasks.workunit.client.0.vm05.stdout:rbd - every 1d starting at 01:30:00
2026-03-24T11:55:30.746 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd trash purge schedule ls -p rbd2
2026-03-24T11:55:30.746 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule ls -p rbd2
2026-03-24T11:55:30.774 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:55:30.774 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd trash purge schedule ls -p rbd2 -R --format json
2026-03-24T11:55:30.802 INFO:tasks.workunit.client.0.vm05.stderr:+ test '[]' = '[]'
2026-03-24T11:55:30.803 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule add -p rbd2/ns1 2d
2026-03-24T11:55:30.833 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd trash purge schedule ls -p rbd2 -R --format json
2026-03-24T11:55:30.860 INFO:tasks.workunit.client.0.vm05.stderr:+ test '[{"pool":"rbd2","namespace":"ns1","items":[{"interval":"2d","start_time":""}]}]' '!=' '[]'
2026-03-24T11:55:30.860 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule ls -p rbd2 -R
2026-03-24T11:55:30.860 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'rbd2 *ns1 *every 2d'
2026-03-24T11:55:30.889 INFO:tasks.workunit.client.0.vm05.stdout:rbd2 ns1 every 2d
2026-03-24T11:55:30.889 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule rm -p rbd2/ns1
2026-03-24T11:55:30.917 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd trash purge schedule ls -p rbd2 -R --format json
2026-03-24T11:55:30.946 INFO:tasks.workunit.client.0.vm05.stderr:+ test '[]' = '[]'
2026-03-24T11:55:30.946 INFO:tasks.workunit.client.0.vm05.stderr:++ seq 12
2026-03-24T11:55:30.947 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T11:55:30.948 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd trash purge schedule status --format xml
2026-03-24T11:55:30.948 INFO:tasks.workunit.client.0.vm05.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-24T11:55:30.977 INFO:tasks.workunit.client.0.vm05.stderr:+ test '' = rbd
2026-03-24T11:55:30.977 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T11:55:40.979 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T11:55:40.979 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd trash purge schedule status --format xml
2026-03-24T11:55:40.979 INFO:tasks.workunit.client.0.vm05.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-24T11:55:41.005 INFO:tasks.workunit.client.0.vm05.stderr:+ test '' = rbd
2026-03-24T11:55:41.007 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T11:55:51.006 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T11:55:51.006 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd trash purge schedule status --format xml
2026-03-24T11:55:51.006 INFO:tasks.workunit.client.0.vm05.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-24T11:55:51.033 INFO:tasks.workunit.client.0.vm05.stderr:+ test '' = rbd
2026-03-24T11:55:51.033 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T11:56:01.034 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T11:56:01.035 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd trash purge schedule status --format xml
2026-03-24T11:56:01.035 INFO:tasks.workunit.client.0.vm05.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-24T11:56:01.066 INFO:tasks.workunit.client.0.vm05.stderr:+ test '' = rbd
2026-03-24T11:56:01.066 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T11:56:11.067 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T11:56:11.067 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd trash purge schedule status --format xml
2026-03-24T11:56:11.067 INFO:tasks.workunit.client.0.vm05.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-24T11:56:11.094 INFO:tasks.workunit.client.0.vm05.stderr:+ test '' = rbd
2026-03-24T11:56:11.094 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T11:56:21.095 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T11:56:21.095 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd trash purge schedule status --format xml
2026-03-24T11:56:21.095 INFO:tasks.workunit.client.0.vm05.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-24T11:56:21.120 INFO:tasks.workunit.client.0.vm05.stderr:+ test rbd = rbd
2026-03-24T11:56:21.120 INFO:tasks.workunit.client.0.vm05.stderr:+ break
2026-03-24T11:56:21.120 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule status
2026-03-24T11:56:21.141 INFO:tasks.workunit.client.0.vm05.stdout:POOL NAMESPACE SCHEDULE TIME
2026-03-24T11:56:21.141 INFO:tasks.workunit.client.0.vm05.stdout:rbd 2026-03-25 01:30:00
2026-03-24T11:56:21.144 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd trash purge schedule status --format xml
2026-03-24T11:56:21.144 INFO:tasks.workunit.client.0.vm05.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-24T11:56:21.170 INFO:tasks.workunit.client.0.vm05.stderr:+ test rbd = rbd
2026-03-24T11:56:21.170 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd trash purge schedule status -p rbd --format xml
2026-03-24T11:56:21.170 INFO:tasks.workunit.client.0.vm05.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-24T11:56:21.198 INFO:tasks.workunit.client.0.vm05.stderr:+ test rbd = rbd
2026-03-24T11:56:21.198 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule add 2d 00:17
2026-03-24T11:56:21.229 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule ls
2026-03-24T11:56:21.230 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'every 2d starting at 00:17'
2026-03-24T11:56:21.254 INFO:tasks.workunit.client.0.vm05.stdout:every 2d starting at 00:17:00
2026-03-24T11:56:21.255 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule ls -R
2026-03-24T11:56:21.255 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'every 2d starting at 00:17'
2026-03-24T11:56:21.279 INFO:tasks.workunit.client.0.vm05.stdout:- - every 2d starting at 00:17:00
2026-03-24T11:56:21.279 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd trash purge schedule ls -p rbd2
2026-03-24T11:56:21.279 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule ls -p rbd2
2026-03-24T11:56:21.306 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:56:21.306 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule ls -p rbd2 -R
2026-03-24T11:56:21.306 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'every 2d starting at 00:17'
2026-03-24T11:56:21.333 INFO:tasks.workunit.client.0.vm05.stdout:- - every 2d starting at 00:17:00
2026-03-24T11:56:21.333 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule ls -p rbd2/ns1 -R
2026-03-24T11:56:21.333 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'every 2d starting at 00:17'
2026-03-24T11:56:21.365 INFO:tasks.workunit.client.0.vm05.stdout:- - every 2d starting at 00:17:00
2026-03-24T11:56:21.366 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd trash purge schedule ls -R -p rbd2/ns1 --format xml
2026-03-24T11:56:21.366 INFO:tasks.workunit.client.0.vm05.stderr:++ xmlstarlet sel -t -v //schedules/schedule/pool
2026-03-24T11:56:21.399 INFO:tasks.workunit.client.0.vm05.stderr:+ test - = -
2026-03-24T11:56:21.399 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd trash purge schedule ls -R -p rbd2/ns1 --format xml
2026-03-24T11:56:21.399 INFO:tasks.workunit.client.0.vm05.stderr:++ xmlstarlet sel -t -v //schedules/schedule/namespace
2026-03-24T11:56:21.433 INFO:tasks.workunit.client.0.vm05.stderr:+ test - = -
2026-03-24T11:56:21.434 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd trash purge schedule ls -R -p rbd2/ns1 --format xml
2026-03-24T11:56:21.434 INFO:tasks.workunit.client.0.vm05.stderr:++ xmlstarlet sel -t -v //schedules/schedule/items/item/start_time
2026-03-24T11:56:21.469 INFO:tasks.workunit.client.0.vm05.stderr:+ test 00:17:00 = 00:17:00
2026-03-24T11:56:21.469 INFO:tasks.workunit.client.0.vm05.stderr:++ seq 12
2026-03-24T11:56:21.470 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T11:56:21.470 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule status --format xml
2026-03-24T11:56:21.470 INFO:tasks.workunit.client.0.vm05.stderr:+ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-24T11:56:21.470 INFO:tasks.workunit.client.0.vm05.stderr:+ grep rbd2
2026-03-24T11:56:21.498 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T11:56:31.499 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T11:56:31.499 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule status --format xml
2026-03-24T11:56:31.499 INFO:tasks.workunit.client.0.vm05.stderr:+ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-24T11:56:31.499 INFO:tasks.workunit.client.0.vm05.stderr:+ grep rbd2
2026-03-24T11:56:31.524 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T11:56:41.526 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T11:56:41.526 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule status --format xml
2026-03-24T11:56:41.526 INFO:tasks.workunit.client.0.vm05.stderr:+ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-24T11:56:41.526 INFO:tasks.workunit.client.0.vm05.stderr:+ grep rbd2
2026-03-24T11:56:41.553 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T11:56:51.554 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T11:56:51.554 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule status --format xml
2026-03-24T11:56:51.554 INFO:tasks.workunit.client.0.vm05.stderr:+ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-24T11:56:51.554 INFO:tasks.workunit.client.0.vm05.stderr:+ grep rbd2
2026-03-24T11:56:51.578 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T11:57:01.579 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T11:57:01.580 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule status --format xml
2026-03-24T11:57:01.580 INFO:tasks.workunit.client.0.vm05.stderr:+ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-24T11:57:01.580 INFO:tasks.workunit.client.0.vm05.stderr:+ grep rbd2
2026-03-24T11:57:01.606 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T11:57:11.608 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T11:57:11.608 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule status --format xml
2026-03-24T11:57:11.608 INFO:tasks.workunit.client.0.vm05.stderr:+ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-24T11:57:11.608 INFO:tasks.workunit.client.0.vm05.stderr:+ grep rbd2
2026-03-24T11:57:11.632 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T11:57:21.634 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T11:57:21.634 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule status --format xml
2026-03-24T11:57:21.634 INFO:tasks.workunit.client.0.vm05.stderr:+ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-24T11:57:21.634 INFO:tasks.workunit.client.0.vm05.stderr:+ grep rbd2
2026-03-24T11:57:21.659 INFO:tasks.workunit.client.0.vm05.stdout:rbd2
2026-03-24T11:57:21.659 INFO:tasks.workunit.client.0.vm05.stdout:rbd2
2026-03-24T11:57:21.659 INFO:tasks.workunit.client.0.vm05.stderr:+ break
2026-03-24T11:57:21.659 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule status
2026-03-24T11:57:21.680 INFO:tasks.workunit.client.0.vm05.stdout:POOL NAMESPACE SCHEDULE TIME
2026-03-24T11:57:21.680 INFO:tasks.workunit.client.0.vm05.stdout:rbd 2026-03-25 01:30:00
2026-03-24T11:57:21.680 INFO:tasks.workunit.client.0.vm05.stdout:rbd2 2026-03-26 00:17:00
2026-03-24T11:57:21.680 INFO:tasks.workunit.client.0.vm05.stdout:rbd2 ns1 2026-03-26 00:17:00
2026-03-24T11:57:21.683 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule status --format xml
2026-03-24T11:57:21.683 INFO:tasks.workunit.client.0.vm05.stderr:+ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-24T11:57:21.683 INFO:tasks.workunit.client.0.vm05.stderr:+ grep rbd2
2026-03-24T11:57:21.709 INFO:tasks.workunit.client.0.vm05.stdout:rbd2
2026-03-24T11:57:21.709 INFO:tasks.workunit.client.0.vm05.stdout:rbd2
2026-03-24T11:57:21.709 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'rbd rbd2 rbd2'
2026-03-24T11:57:21.709 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd trash purge schedule status --format xml
2026-03-24T11:57:21.709 INFO:tasks.workunit.client.0.vm05.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-24T11:57:21.735 INFO:tasks.workunit.client.0.vm05.stdout:rbd rbd2 rbd2
2026-03-24T11:57:21.735 INFO:tasks.workunit.client.0.vm05.stderr:+ echo rbd rbd2 rbd2
2026-03-24T11:57:21.736 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd trash purge schedule status -p rbd --format xml
2026-03-24T11:57:21.736 INFO:tasks.workunit.client.0.vm05.stderr:++ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-24T11:57:21.762 INFO:tasks.workunit.client.0.vm05.stderr:+ test rbd = rbd
2026-03-24T11:57:21.763 INFO:tasks.workunit.client.0.vm05.stderr:+++ rbd trash purge schedule status -p rbd2 --format xml
2026-03-24T11:57:21.763 INFO:tasks.workunit.client.0.vm05.stderr:+++ xmlstarlet sel -t -v //scheduled/item/pool
2026-03-24T11:57:21.788 INFO:tasks.workunit.client.0.vm05.stderr:++ echo rbd2 rbd2
2026-03-24T11:57:21.789 INFO:tasks.workunit.client.0.vm05.stderr:+ test 'rbd2 rbd2' = 'rbd2 rbd2'
2026-03-24T11:57:21.789 INFO:tasks.workunit.client.0.vm05.stderr:+++ rbd trash purge schedule ls -R --format xml
2026-03-24T11:57:21.789 INFO:tasks.workunit.client.0.vm05.stderr:+++ xmlstarlet sel -t -v //schedules/schedule/items
2026-03-24T11:57:21.817 INFO:tasks.workunit.client.0.vm05.stderr:++ echo 2d00:17:00 1d01:30:00
2026-03-24T11:57:21.817 INFO:tasks.workunit.client.0.vm05.stderr:+ test '2d00:17:00 1d01:30:00' = '2d00:17:00 1d01:30:00'
2026-03-24T11:57:21.817 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule add 1d
2026-03-24T11:57:21.849 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule ls
2026-03-24T11:57:21.849 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'every 2d starting at 00:17'
2026-03-24T11:57:21.873 INFO:tasks.workunit.client.0.vm05.stdout:every 1d, every 2d starting at 00:17:00
2026-03-24T11:57:21.873 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule ls
2026-03-24T11:57:21.873 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'every 1d'
2026-03-24T11:57:21.898 INFO:tasks.workunit.client.0.vm05.stdout:every 1d, every 2d starting at 00:17:00
2026-03-24T11:57:21.898 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule ls -R --format xml
2026-03-24T11:57:21.898 INFO:tasks.workunit.client.0.vm05.stderr:+ xmlstarlet sel -t -v //schedules/schedule/items
2026-03-24T11:57:21.898 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 2d00:17
2026-03-24T11:57:21.923 INFO:tasks.workunit.client.0.vm05.stdout:1d2d00:17:00
2026-03-24T11:57:21.924 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule rm 1d
2026-03-24T11:57:21.954 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule ls
2026-03-24T11:57:21.954 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'every 2d starting at 00:17'
2026-03-24T11:57:21.981 INFO:tasks.workunit.client.0.vm05.stdout:every 2d starting at 00:17:00
2026-03-24T11:57:21.981 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule rm 2d 00:17
2026-03-24T11:57:22.012 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd trash purge schedule ls
2026-03-24T11:57:22.012 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule ls
2026-03-24T11:57:22.036 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:57:22.036 INFO:tasks.workunit.client.0.vm05.stderr:+ for p in rbd2 rbd2/ns1
2026-03-24T11:57:22.036 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 -s 1 rbd2/ns1/test1
2026-03-24T11:57:22.069 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv rbd2/ns1/test1
2026-03-24T11:57:22.108 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls rbd2/ns1
2026-03-24T11:57:22.108 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:57:22.108 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^1$'
2026-03-24T11:57:22.132 INFO:tasks.workunit.client.0.vm05.stdout:1
2026-03-24T11:57:22.132 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule add -p rbd2 1m
2026-03-24T11:57:22.159 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule list -p rbd2 -R
2026-03-24T11:57:22.159 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'every 1m'
2026-03-24T11:57:22.184 INFO:tasks.workunit.client.0.vm05.stdout:rbd2 - every 1m
2026-03-24T11:57:22.184 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule list -p rbd2/ns1 -R
2026-03-24T11:57:22.184 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'every 1m'
2026-03-24T11:57:22.210 INFO:tasks.workunit.client.0.vm05.stdout:rbd2 - every 1m
2026-03-24T11:57:22.210 INFO:tasks.workunit.client.0.vm05.stderr:++ seq 12
2026-03-24T11:57:22.211 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T11:57:22.211 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls rbd2/ns1
2026-03-24T11:57:22.211 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:57:22.211 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^1$'
2026-03-24T11:57:22.236 INFO:tasks.workunit.client.0.vm05.stdout:1
2026-03-24T11:57:22.236 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T11:57:32.237 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T11:57:32.237 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls rbd2/ns1
2026-03-24T11:57:32.237 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:57:32.238 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^1$'
2026-03-24T11:57:32.262 INFO:tasks.workunit.client.0.vm05.stdout:1
2026-03-24T11:57:32.263 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T11:57:42.264 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T11:57:42.264 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls rbd2/ns1
2026-03-24T11:57:42.264 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:57:42.264 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^1$'
2026-03-24T11:57:42.289 INFO:tasks.workunit.client.0.vm05.stdout:1
2026-03-24T11:57:42.289 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T11:57:52.290 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T11:57:52.291 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls rbd2/ns1
2026-03-24T11:57:52.291 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:57:52.291 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^1$'
2026-03-24T11:57:52.317 INFO:tasks.workunit.client.0.vm05.stdout:1
2026-03-24T11:57:52.317 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T11:58:02.319 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T11:58:02.319 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls rbd2/ns1
2026-03-24T11:58:02.319 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:58:02.319 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^1$'
2026-03-24T11:58:02.344 INFO:tasks.workunit.client.0.vm05.stderr:+ break
2026-03-24T11:58:02.344 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls rbd2/ns1
2026-03-24T11:58:02.345 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:58:02.345 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^0$'
2026-03-24T11:58:02.369 INFO:tasks.workunit.client.0.vm05.stdout:0
2026-03-24T11:58:02.369 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule list -p rbd2 -R
2026-03-24T11:58:02.370 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'every 1m'
2026-03-24T11:58:02.395 INFO:tasks.workunit.client.0.vm05.stdout:rbd2 - every 1m
2026-03-24T11:58:02.396 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule list -p rbd2/ns1 -R
2026-03-24T11:58:02.396 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'every 1m'
2026-03-24T11:58:02.423 INFO:tasks.workunit.client.0.vm05.stdout:rbd2 - every 1m
2026-03-24T11:58:02.423 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule status
2026-03-24T11:58:02.423 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'rbd2 *ns1'
2026-03-24T11:58:02.447 INFO:tasks.workunit.client.0.vm05.stdout:rbd2 ns1 2026-03-24 11:59:00
2026-03-24T11:58:02.447 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule status -p rbd2
2026-03-24T11:58:02.447 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'rbd2 *ns1'
2026-03-24T11:58:02.473 INFO:tasks.workunit.client.0.vm05.stdout:rbd2 ns1 2026-03-24 11:59:00
2026-03-24T11:58:02.473 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule status -p rbd2/ns1
2026-03-24T11:58:02.473 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'rbd2 *ns1'
2026-03-24T11:58:02.500 INFO:tasks.workunit.client.0.vm05.stdout:rbd2 ns1 2026-03-24 11:59:00
2026-03-24T11:58:02.500 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule rm -p rbd2 1m
2026-03-24T11:58:02.527 INFO:tasks.workunit.client.0.vm05.stderr:+ for p in rbd2 rbd2/ns1
2026-03-24T11:58:02.527 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 -s 1 rbd2/ns1/test1
2026-03-24T11:58:02.557 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash mv rbd2/ns1/test1
2026-03-24T11:58:02.597 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls rbd2/ns1
2026-03-24T11:58:02.597 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:58:02.597 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^1$'
2026-03-24T11:58:02.621 INFO:tasks.workunit.client.0.vm05.stdout:1
2026-03-24T11:58:02.621 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule add -p rbd2/ns1 1m
2026-03-24T11:58:02.649 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule list -p rbd2 -R
2026-03-24T11:58:02.649 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'every 1m'
2026-03-24T11:58:02.674 INFO:tasks.workunit.client.0.vm05.stdout:rbd2 ns1 every 1m
2026-03-24T11:58:02.675 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule list -p rbd2/ns1 -R
2026-03-24T11:58:02.675 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'every 1m'
2026-03-24T11:58:02.700 INFO:tasks.workunit.client.0.vm05.stdout:rbd2 ns1 every 1m
2026-03-24T11:58:02.700 INFO:tasks.workunit.client.0.vm05.stderr:++ seq 12
2026-03-24T11:58:02.702 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T11:58:02.702 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls rbd2/ns1
2026-03-24T11:58:02.702 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:58:02.702 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^1$'
2026-03-24T11:58:02.724 INFO:tasks.workunit.client.0.vm05.stdout:1
2026-03-24T11:58:02.725 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T11:58:12.726 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T11:58:12.726 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls rbd2/ns1
2026-03-24T11:58:12.726 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:58:12.726 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^1$'
2026-03-24T11:58:12.750 INFO:tasks.workunit.client.0.vm05.stdout:1
2026-03-24T11:58:12.750 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T11:58:22.751 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T11:58:22.751 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls rbd2/ns1
2026-03-24T11:58:22.751 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:58:22.751 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^1$'
2026-03-24T11:58:22.775 INFO:tasks.workunit.client.0.vm05.stdout:1
2026-03-24T11:58:22.775 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T11:58:32.777 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T11:58:32.777 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls rbd2/ns1
2026-03-24T11:58:32.777 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:58:32.777 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^1$'
2026-03-24T11:58:32.801 INFO:tasks.workunit.client.0.vm05.stdout:1
2026-03-24T11:58:32.802 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T11:58:42.803 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T11:58:42.803 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls rbd2/ns1
2026-03-24T11:58:42.803 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:58:42.803 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^1$'
2026-03-24T11:58:42.829 INFO:tasks.workunit.client.0.vm05.stdout:1
2026-03-24T11:58:42.829 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T11:58:52.831 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T11:58:52.831 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls rbd2/ns1
2026-03-24T11:58:52.831 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:58:52.831 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^1$'
2026-03-24T11:58:52.856 INFO:tasks.workunit.client.0.vm05.stdout:1
2026-03-24T11:58:52.856 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T11:59:02.858 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T11:59:02.858 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls rbd2/ns1
2026-03-24T11:59:02.858 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:59:02.858 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^1$'
2026-03-24T11:59:02.883 INFO:tasks.workunit.client.0.vm05.stderr:+ break
2026-03-24T11:59:02.883 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash ls rbd2/ns1
2026-03-24T11:59:02.883 INFO:tasks.workunit.client.0.vm05.stderr:+ wc -l
2026-03-24T11:59:02.883 INFO:tasks.workunit.client.0.vm05.stderr:+ grep '^0$'
2026-03-24T11:59:02.907 INFO:tasks.workunit.client.0.vm05.stdout:0
2026-03-24T11:59:02.907 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule list -p rbd2 -R
2026-03-24T11:59:02.907 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'every 1m'
2026-03-24T11:59:02.933 INFO:tasks.workunit.client.0.vm05.stdout:rbd2 ns1 every 1m
2026-03-24T11:59:02.933 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule list -p rbd2/ns1 -R
2026-03-24T11:59:02.933 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'every 1m'
2026-03-24T11:59:02.959 INFO:tasks.workunit.client.0.vm05.stdout:rbd2 ns1 every 1m
2026-03-24T11:59:02.959 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule status
2026-03-24T11:59:02.959 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'rbd2 *ns1'
2026-03-24T11:59:02.984 INFO:tasks.workunit.client.0.vm05.stdout:rbd2 ns1 2026-03-24 12:00:00
2026-03-24T11:59:02.984 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule status -p rbd2
2026-03-24T11:59:02.984 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'rbd2 *ns1'
2026-03-24T11:59:03.010 INFO:tasks.workunit.client.0.vm05.stdout:rbd2 ns1 2026-03-24 12:00:00
2026-03-24T11:59:03.010 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule status -p rbd2/ns1
2026-03-24T11:59:03.010 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'rbd2 *ns1'
2026-03-24T11:59:03.036 INFO:tasks.workunit.client.0.vm05.stdout:rbd2 ns1 2026-03-24 12:00:00
2026-03-24T11:59:03.036 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule rm -p rbd2/ns1 1m
2026-03-24T11:59:03.064 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule add 2m
2026-03-24T11:59:03.095 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd trash purge schedule add -p rbd dummy
2026-03-24T11:59:03.095 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule add -p rbd dummy
2026-03-24T11:59:03.117 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T11:59:03.119+0000 7f1d10701640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy)
2026-03-24T11:59:03.117 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd trash purge schedule add failed: (22) Invalid argument: Invalid interval (dummy)
2026-03-24T11:59:03.120 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:59:03.120 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd trash purge schedule add -p rbd 1d dummy
2026-03-24T11:59:03.120 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule add -p rbd 1d dummy
2026-03-24T11:59:03.144 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T11:59:03.143+0000 7f1d10701640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy
2026-03-24T11:59:03.144 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd trash purge schedule add failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy
2026-03-24T11:59:03.146 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:59:03.146 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd trash purge schedule add dummy
2026-03-24T11:59:03.146 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule add dummy
2026-03-24T11:59:03.168 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T11:59:03.167+0000 7f1d10701640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy)
2026-03-24T11:59:03.168 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd trash purge schedule add failed: (22) Invalid argument: Invalid interval (dummy)
2026-03-24T11:59:03.171 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:59:03.171 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd trash purge schedule add 1d dummy
2026-03-24T11:59:03.171 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule add 1d dummy
2026-03-24T11:59:03.392 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:59:03.391+0000 7fee0bfff640  0 --2- 192.168.123.105:0/4005663479 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x55d33c556420 0x55d33c565bf0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
2026-03-24T11:59:03.396 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T11:59:03.395+0000 7f1d10701640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy
2026-03-24T11:59:03.396 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd trash purge schedule add failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy
2026-03-24T11:59:03.399 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:59:03.399 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd trash purge schedule remove -p rbd dummy
2026-03-24T11:59:03.399 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule remove -p rbd dummy
2026-03-24T11:59:03.422 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T11:59:03.423+0000 7f1d10701640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy)
2026-03-24T11:59:03.423 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd trash purge schedule remove failed: (22) Invalid argument: Invalid interval (dummy)
2026-03-24T11:59:03.426 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:59:03.426 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd trash purge schedule remove -p rbd 1d dummy
2026-03-24T11:59:03.426 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule remove -p rbd 1d dummy
2026-03-24T11:59:03.450 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T11:59:03.451+0000 7f1d10701640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy
2026-03-24T11:59:03.450 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd trash purge schedule remove failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy
2026-03-24T11:59:03.453 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:59:03.453 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd trash purge schedule remove dummy
2026-03-24T11:59:03.453 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule remove dummy
2026-03-24T11:59:03.475 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T11:59:03.475+0000 7f1d10701640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy)
2026-03-24T11:59:03.475 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd trash purge schedule remove failed: (22) Invalid argument: Invalid interval (dummy)
2026-03-24T11:59:03.478 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:59:03.478 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd trash purge schedule remove 1d dummy
2026-03-24T11:59:03.478 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule remove 1d dummy
2026-03-24T11:59:03.501 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T11:59:03.503+0000 7f1d10701640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy
2026-03-24T11:59:03.501 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd trash purge schedule remove failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy
2026-03-24T11:59:03.504 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:59:03.504 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule ls -p rbd
2026-03-24T11:59:03.504 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'every 1d starting at 01:30'
2026-03-24T11:59:03.530 INFO:tasks.workunit.client.0.vm05.stdout:every 1d starting at 01:30:00
2026-03-24T11:59:03.530 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule ls
2026-03-24T11:59:03.530 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'every 2m'
2026-03-24T11:59:03.554 INFO:tasks.workunit.client.0.vm05.stdout:every 2m
2026-03-24T11:59:03.555 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule remove -p rbd 1d 01:30
2026-03-24T11:59:03.586 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule remove 2m
2026-03-24T11:59:03.619 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd trash purge schedule ls -R --format json
2026-03-24T11:59:03.645 INFO:tasks.workunit.client.0.vm05.stderr:+ test '[]' = '[]'
2026-03-24T11:59:03.645 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images
2026-03-24T11:59:03.645 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:03.725 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:03.804 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:03.885 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:03.967 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:04.048 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:04.376 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:04.458 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:04.538 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:04.815 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:04.895 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:04.977 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:05.059 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:05.140 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:05.221 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:05.302 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:05.382 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:05.663 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:05.742 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd pool rm rbd2 rbd2 --yes-i-really-really-mean-it
2026-03-24T11:59:06.159 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd2' does not exist
2026-03-24T11:59:06.172 INFO:tasks.workunit.client.0.vm05.stdout:testing recovery of trash_purge_schedule handler after module's RADOS client is blocklisted...
2026-03-24T11:59:06.172 INFO:tasks.workunit.client.0.vm05.stderr:+ test_trash_purge_schedule_recovery
2026-03-24T11:59:06.172 INFO:tasks.workunit.client.0.vm05.stderr:+ echo 'testing recovery of trash_purge_schedule handler after module'\''s RADOS client is blocklisted...'
2026-03-24T11:59:06.172 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images
2026-03-24T11:59:06.172 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:06.254 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:06.335 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:06.413 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:06.492 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:06.571 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:06.651 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:06.731 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:06.814 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:06.894 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:06.972 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:07.053 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:07.135 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:07.215 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:07.293 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:07.374 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:07.452 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:07.533 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:07.611 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd pool create rbd3 8
2026-03-24T11:59:08.155 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd3' already exists
2026-03-24T11:59:08.167 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd pool init rbd3
2026-03-24T11:59:11.125 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd namespace create rbd3/ns1
2026-03-24T11:59:11.343 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T11:59:11.347+0000 7f517661f640  0 --2- 192.168.123.105:0/1125583508 >> v2:192.168.123.105:3300/0 conn(0x56414cfad720 0x56414cefc490 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
2026-03-24T11:59:11.358 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule add -p rbd3/ns1 2d
2026-03-24T11:59:11.385 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule ls -p rbd3 -R
2026-03-24T11:59:11.385 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'rbd3 *ns1 *every 2d'
2026-03-24T11:59:11.411 INFO:tasks.workunit.client.0.vm05.stdout:rbd3 ns1 every 2d
2026-03-24T11:59:11.411 INFO:tasks.workunit.client.0.vm05.stderr:++ ceph mgr dump
2026-03-24T11:59:11.412 INFO:tasks.workunit.client.0.vm05.stderr:++ jq 'select(.name == "rbd_support")'
2026-03-24T11:59:11.412 INFO:tasks.workunit.client.0.vm05.stderr:++ jq '.active_clients[]'
2026-03-24T11:59:11.412 INFO:tasks.workunit.client.0.vm05.stderr:++ jq -r '[.addrvec[0].addr, "/", .addrvec[0].nonce|tostring] | add'
2026-03-24T11:59:11.693 INFO:tasks.workunit.client.0.vm05.stderr:+ CLIENT_ADDR=192.168.123.105:0/4260664610
2026-03-24T11:59:11.694 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd blocklist add 192.168.123.105:0/4260664610
2026-03-24T11:59:13.124 INFO:tasks.workunit.client.0.vm05.stderr:blocklisting 192.168.123.105:0/4260664610 until 2026-03-24T12:59:12.185962+0000 (3600 sec)
2026-03-24T11:59:13.144 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd trash purge schedule add -p rbd3 10m
2026-03-24T11:59:13.144 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule add -p rbd3 10m
2026-03-24T11:59:13.169 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T11:59:13.171+0000 7f1d10701640 -1 mgr.server reply reply (11) Resource temporarily unavailable [errno 108] RADOS connection was shutdown (Failed to operate write op for oid rbd_trash_purge_schedule)
2026-03-24T11:59:13.169 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd trash purge schedule add failed: (11) Resource temporarily unavailable: [errno 108] RADOS connection was shutdown (Failed to operate write op for oid rbd_trash_purge_schedule)
2026-03-24T11:59:13.172 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:59:13.172 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T11:59:23.173 INFO:tasks.workunit.client.0.vm05.stderr:++ seq 24
2026-03-24T11:59:23.174 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 24`
2026-03-24T11:59:23.174 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule add -p rbd3 10m
2026-03-24T11:59:23.202 INFO:tasks.workunit.client.0.vm05.stderr:+ break
2026-03-24T11:59:23.202 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule ls -p rbd3 -R
2026-03-24T11:59:23.202 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'every 10m'
2026-03-24T11:59:23.229 INFO:tasks.workunit.client.0.vm05.stdout:rbd3 - every 10m
2026-03-24T11:59:23.229 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule ls -p rbd3 -R
2026-03-24T11:59:23.230 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'rbd3 *ns1 *every 2d'
2026-03-24T11:59:23.255 INFO:tasks.workunit.client.0.vm05.stdout:rbd3 ns1 every 2d
2026-03-24T11:59:23.255 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule remove -p rbd3 10m
2026-03-24T11:59:23.282 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule remove -p rbd3/ns1 2d
2026-03-24T11:59:23.308 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule ls -p rbd3 -R
2026-03-24T11:59:23.308 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail grep 'every 10m'
2026-03-24T11:59:23.308 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'every 10m'
2026-03-24T11:59:23.333 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:59:23.333 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd trash purge schedule ls -p rbd3 -R
2026-03-24T11:59:23.333 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail grep 'rbd3 *ns1 *every 2d'
2026-03-24T11:59:23.333 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'rbd3 *ns1 *every 2d'
2026-03-24T11:59:23.359 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:59:23.359 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd pool rm rbd3 rbd3 --yes-i-really-really-mean-it
2026-03-24T11:59:23.738 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd3' does not exist
2026-03-24T11:59:23.751 INFO:tasks.workunit.client.0.vm05.stdout:testing mirror snapshot schedule...
2026-03-24T11:59:23.751 INFO:tasks.workunit.client.0.vm05.stderr:+ test_mirror_snapshot_schedule
2026-03-24T11:59:23.751 INFO:tasks.workunit.client.0.vm05.stderr:+ echo 'testing mirror snapshot schedule...'
2026-03-24T11:59:23.751 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images
2026-03-24T11:59:23.751 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:23.833 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:23.911 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:23.995 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:24.073 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:24.150 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:24.227 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:24.304 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:24.382 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:24.458 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:24.536 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:24.614 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:24.897 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:24.976 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:25.057 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:25.136 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:25.216 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:25.296 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T11:59:25.376 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd pool create rbd2 8
2026-03-24T11:59:25.745 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd2' already exists
2026-03-24T11:59:25.757 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd pool init rbd2
2026-03-24T11:59:28.792 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd namespace create rbd2/ns1
2026-03-24T11:59:28.820 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror pool enable rbd2 image
2026-03-24T11:59:28.848 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror pool enable rbd2/ns1 image
2026-03-24T11:59:28.875 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror pool peer add rbd2 cluster1
2026-03-24T11:59:28.898 INFO:tasks.workunit.client.0.vm05.stdout:9f2319f8-8a85-4933-a234-35da93abf250
2026-03-24T11:59:28.901 INFO:tasks.workunit.client.0.vm05.stderr:++ ceph rbd mirror snapshot schedule list
2026-03-24T11:59:29.174 INFO:tasks.workunit.client.0.vm05.stderr:+ test '{}' = '{}'
2026-03-24T11:59:29.174 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph rbd mirror snapshot schedule status
2026-03-24T11:59:29.174 INFO:tasks.workunit.client.0.vm05.stderr:+ fgrep '"scheduled_images": []'
2026-03-24T11:59:29.443 INFO:tasks.workunit.client.0.vm05.stdout: "scheduled_images": []
2026-03-24T11:59:29.443 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd mirror snapshot schedule ls
2026-03-24T11:59:29.444 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule ls
2026-03-24T11:59:29.468 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:59:29.468 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror snapshot schedule ls -R --format json
2026-03-24T11:59:29.492 INFO:tasks.workunit.client.0.vm05.stderr:+ test '[]' = '[]'
2026-03-24T11:59:29.493 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 -s 1 rbd2/ns1/test1
2026-03-24T11:59:29.523 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror image status rbd2/ns1/test1
2026-03-24T11:59:29.524 INFO:tasks.workunit.client.0.vm05.stderr:++ grep -c mirror.primary
2026-03-24T11:59:29.548 INFO:tasks.workunit.client.0.vm05.stderr:rbd: mirroring not enabled on the image
2026-03-24T11:59:29.552 INFO:tasks.workunit.client.0.vm05.stderr:+ test 0 = 0
2026-03-24T11:59:29.552 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror image enable rbd2/ns1/test1 snapshot
2026-03-24T11:59:29.802 INFO:tasks.workunit.client.0.vm05.stdout:Mirroring enabled
2026-03-24T11:59:29.808 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror image status rbd2/ns1/test1
2026-03-24T11:59:29.809 INFO:tasks.workunit.client.0.vm05.stderr:++ grep -c mirror.primary
2026-03-24T11:59:29.841 INFO:tasks.workunit.client.0.vm05.stderr:+ test 1 = 1
2026-03-24T11:59:29.841 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd mirror snapshot schedule remove dummy
2026-03-24T11:59:29.841 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule remove dummy
2026-03-24T11:59:29.864 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T11:59:29.864+0000 7f1d10701640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy)
2026-03-24T11:59:29.864 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd mirror snapshot schedule remove failed: (22) Invalid argument: Invalid interval (dummy)
2026-03-24T11:59:29.866 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:59:29.866 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd mirror snapshot schedule remove 1h dummy
2026-03-24T11:59:29.866 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule remove 1h dummy
2026-03-24T11:59:29.889 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T11:59:29.892+0000 7f1d10701640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy
2026-03-24T11:59:29.890 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd mirror snapshot schedule remove failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy
2026-03-24T11:59:29.893 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:59:29.893 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd mirror snapshot schedule remove -p rbd2/ns1 --image test1 dummy
2026-03-24T11:59:29.893 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule remove -p rbd2/ns1 --image test1 dummy
2026-03-24T11:59:29.923 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T11:59:29.924+0000 7f1d10701640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy)
2026-03-24T11:59:29.923 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd mirror snapshot schedule remove failed: (22) Invalid argument: Invalid interval (dummy)
2026-03-24T11:59:29.926 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:59:29.926 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd mirror snapshot schedule remove -p rbd2/ns1 --image test1 1h dummy
2026-03-24T11:59:29.926 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule remove -p rbd2/ns1 --image test1 1h dummy
2026-03-24T11:59:29.955 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T11:59:29.956+0000 7f1d10701640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy
2026-03-24T11:59:29.955 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd mirror snapshot schedule remove failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy
2026-03-24T11:59:29.958 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:59:29.958 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule add -p rbd2/ns1 --image test1 1m
2026-03-24T11:59:29.996 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd mirror snapshot schedule ls
2026-03-24T11:59:29.996 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule ls
2026-03-24T11:59:30.022 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:59:30.022 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule ls -R
2026-03-24T11:59:30.022 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m'
2026-03-24T11:59:30.046 INFO:tasks.workunit.client.0.vm05.stdout:rbd2 ns1 test1 every 1m
2026-03-24T11:59:30.046 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd mirror snapshot schedule ls -p rbd2
2026-03-24T11:59:30.046 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule ls -p rbd2
2026-03-24T11:59:30.073 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:59:30.073 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule ls -p rbd2 -R
2026-03-24T11:59:30.073 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m'
2026-03-24T11:59:30.099 INFO:tasks.workunit.client.0.vm05.stdout:rbd2 ns1 test1 every 1m
2026-03-24T11:59:30.100 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd mirror snapshot schedule ls -p rbd2/ns1
2026-03-24T11:59:30.100 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule ls -p rbd2/ns1
2026-03-24T11:59:30.127 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T11:59:30.127 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule ls -p rbd2/ns1 -R
2026-03-24T11:59:30.127 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m'
2026-03-24T11:59:30.156 INFO:tasks.workunit.client.0.vm05.stdout:rbd2 ns1 test1 every 1m
2026-03-24T11:59:30.156 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror snapshot schedule ls
-p rbd2/ns1 --image test1 2026-03-24T11:59:30.188 INFO:tasks.workunit.client.0.vm05.stderr:+ test 'every 1m' = 'every 1m' 2026-03-24T11:59:30.188 INFO:tasks.workunit.client.0.vm05.stderr:++ seq 12 2026-03-24T11:59:30.189 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12` 2026-03-24T11:59:30.189 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-24T11:59:30.189 INFO:tasks.workunit.client.0.vm05.stderr:++ grep -c mirror.primary 2026-03-24T11:59:30.220 INFO:tasks.workunit.client.0.vm05.stderr:+ test 1 -gt 1 2026-03-24T11:59:30.220 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10 2026-03-24T11:59:40.221 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12` 2026-03-24T11:59:40.221 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-24T11:59:40.221 INFO:tasks.workunit.client.0.vm05.stderr:++ grep -c mirror.primary 2026-03-24T11:59:40.255 INFO:tasks.workunit.client.0.vm05.stderr:+ test 1 -gt 1 2026-03-24T11:59:40.255 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10 2026-03-24T11:59:50.256 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12` 2026-03-24T11:59:50.256 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-24T11:59:50.256 INFO:tasks.workunit.client.0.vm05.stderr:++ grep -c mirror.primary 2026-03-24T11:59:50.288 INFO:tasks.workunit.client.0.vm05.stderr:+ test 1 -gt 1 2026-03-24T11:59:50.288 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10 2026-03-24T12:00:00.289 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12` 2026-03-24T12:00:00.289 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-24T12:00:00.289 INFO:tasks.workunit.client.0.vm05.stderr:++ grep -c mirror.primary 2026-03-24T12:00:00.322 INFO:tasks.workunit.client.0.vm05.stderr:+ test 1 -gt 1 2026-03-24T12:00:00.322 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10 2026-03-24T12:00:10.323 
INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12` 2026-03-24T12:00:10.323 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-24T12:00:10.323 INFO:tasks.workunit.client.0.vm05.stderr:++ grep -c mirror.primary 2026-03-24T12:00:10.356 INFO:tasks.workunit.client.0.vm05.stderr:+ test 1 -gt 1 2026-03-24T12:00:10.356 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10 2026-03-24T12:00:20.357 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12` 2026-03-24T12:00:20.357 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-24T12:00:20.357 INFO:tasks.workunit.client.0.vm05.stderr:++ grep -c mirror.primary 2026-03-24T12:00:20.388 INFO:tasks.workunit.client.0.vm05.stderr:+ test 1 -gt 1 2026-03-24T12:00:20.388 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10 2026-03-24T12:00:30.389 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12` 2026-03-24T12:00:30.389 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-24T12:00:30.389 INFO:tasks.workunit.client.0.vm05.stderr:++ grep -c mirror.primary 2026-03-24T12:00:30.420 INFO:tasks.workunit.client.0.vm05.stderr:+ test 1 -gt 1 2026-03-24T12:00:30.420 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10 2026-03-24T12:00:40.421 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12` 2026-03-24T12:00:40.421 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-24T12:00:40.421 INFO:tasks.workunit.client.0.vm05.stderr:++ grep -c mirror.primary 2026-03-24T12:00:40.455 INFO:tasks.workunit.client.0.vm05.stderr:+ test 1 -gt 1 2026-03-24T12:00:40.455 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10 2026-03-24T12:00:50.456 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12` 2026-03-24T12:00:50.456 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-24T12:00:50.456 INFO:tasks.workunit.client.0.vm05.stderr:++ grep -c 
mirror.primary 2026-03-24T12:00:50.488 INFO:tasks.workunit.client.0.vm05.stderr:+ test 1 -gt 1 2026-03-24T12:00:50.489 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10 2026-03-24T12:01:00.489 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12` 2026-03-24T12:01:00.490 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-24T12:01:00.490 INFO:tasks.workunit.client.0.vm05.stderr:++ grep -c mirror.primary 2026-03-24T12:01:00.525 INFO:tasks.workunit.client.0.vm05.stderr:+ test 1 -gt 1 2026-03-24T12:01:00.525 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10 2026-03-24T12:01:10.526 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12` 2026-03-24T12:01:10.526 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-24T12:01:10.526 INFO:tasks.workunit.client.0.vm05.stderr:++ grep -c mirror.primary 2026-03-24T12:01:10.560 INFO:tasks.workunit.client.0.vm05.stderr:+ test 2 -gt 1 2026-03-24T12:01:10.560 INFO:tasks.workunit.client.0.vm05.stderr:+ break 2026-03-24T12:01:10.561 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror image status rbd2/ns1/test1 2026-03-24T12:01:10.561 INFO:tasks.workunit.client.0.vm05.stderr:++ grep -c mirror.primary 2026-03-24T12:01:10.595 INFO:tasks.workunit.client.0.vm05.stderr:+ test 2 -gt 1 2026-03-24T12:01:10.595 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd mirror snapshot schedule ls 2026-03-24T12:01:10.595 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule ls 2026-03-24T12:01:10.621 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0 2026-03-24T12:01:10.621 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule ls -R 2026-03-24T12:01:10.621 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m' 2026-03-24T12:01:10.646 INFO:tasks.workunit.client.0.vm05.stdout:rbd2 ns1 test1 every 1m 2026-03-24T12:01:10.647 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd mirror 
snapshot schedule ls -p rbd2 2026-03-24T12:01:10.647 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule ls -p rbd2 2026-03-24T12:01:10.673 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0 2026-03-24T12:01:10.673 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule ls -p rbd2 -R 2026-03-24T12:01:10.673 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m' 2026-03-24T12:01:10.701 INFO:tasks.workunit.client.0.vm05.stdout:rbd2 ns1 test1 every 1m 2026-03-24T12:01:10.701 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd mirror snapshot schedule ls -p rbd2/ns1 2026-03-24T12:01:10.701 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule ls -p rbd2/ns1 2026-03-24T12:01:10.728 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0 2026-03-24T12:01:10.728 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule ls -p rbd2/ns1 -R 2026-03-24T12:01:10.728 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m' 2026-03-24T12:01:10.757 INFO:tasks.workunit.client.0.vm05.stdout:rbd2 ns1 test1 every 1m 2026-03-24T12:01:10.758 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror snapshot schedule ls -p rbd2/ns1 --image test1 2026-03-24T12:01:10.794 INFO:tasks.workunit.client.0.vm05.stderr:+ test 'every 1m' = 'every 1m' 2026-03-24T12:01:10.794 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule status 2026-03-24T12:01:10.817 INFO:tasks.workunit.client.0.vm05.stdout:SCHEDULE TIME IMAGE 2026-03-24T12:01:10.817 INFO:tasks.workunit.client.0.vm05.stdout:2026-03-24 12:02:00 rbd2/ns1/test1 2026-03-24T12:01:10.821 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror snapshot schedule status --format xml 2026-03-24T12:01:10.821 INFO:tasks.workunit.client.0.vm05.stderr:++ xmlstarlet sel -t -v //scheduled_images/image/image 2026-03-24T12:01:10.848 INFO:tasks.workunit.client.0.vm05.stderr:+ test rbd2/ns1/test1 = rbd2/ns1/test1 
2026-03-24T12:01:10.849 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror snapshot schedule status -p rbd2 --format xml
2026-03-24T12:01:10.849 INFO:tasks.workunit.client.0.vm05.stderr:++ xmlstarlet sel -t -v //scheduled_images/image/image
2026-03-24T12:01:11.078 INFO:tasks.workunit.client.0.vm05.stderr:+ test rbd2/ns1/test1 = rbd2/ns1/test1
2026-03-24T12:01:11.079 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror snapshot schedule status -p rbd2/ns1 --format xml
2026-03-24T12:01:11.079 INFO:tasks.workunit.client.0.vm05.stderr:++ xmlstarlet sel -t -v //scheduled_images/image/image
2026-03-24T12:01:11.107 INFO:tasks.workunit.client.0.vm05.stderr:+ test rbd2/ns1/test1 = rbd2/ns1/test1
2026-03-24T12:01:11.107 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror snapshot schedule status -p rbd2/ns1 --image test1 --format xml
2026-03-24T12:01:11.107 INFO:tasks.workunit.client.0.vm05.stderr:++ xmlstarlet sel -t -v //scheduled_images/image/image
2026-03-24T12:01:11.140 INFO:tasks.workunit.client.0.vm05.stderr:+ test rbd2/ns1/test1 = rbd2/ns1/test1
2026-03-24T12:01:11.140 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror image demote rbd2/ns1/test1
2026-03-24T12:01:11.794 INFO:tasks.workunit.client.0.vm05.stdout:Image demoted to non-primary
2026-03-24T12:01:11.799 INFO:tasks.workunit.client.0.vm05.stderr:++ seq 12
2026-03-24T12:01:11.800 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T12:01:11.800 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule status
2026-03-24T12:01:11.801 INFO:tasks.workunit.client.0.vm05.stderr:+ grep rbd2/ns1/test1
2026-03-24T12:01:11.825 INFO:tasks.workunit.client.0.vm05.stdout:2026-03-24 12:02:00 rbd2/ns1/test1
2026-03-24T12:01:11.825 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T12:01:21.826 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T12:01:21.826 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule status
2026-03-24T12:01:21.826 INFO:tasks.workunit.client.0.vm05.stderr:+ grep rbd2/ns1/test1
2026-03-24T12:01:21.850 INFO:tasks.workunit.client.0.vm05.stderr:+ break
2026-03-24T12:01:21.850 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule status
2026-03-24T12:01:21.850 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail grep rbd2/ns1/test1
2026-03-24T12:01:21.850 INFO:tasks.workunit.client.0.vm05.stderr:+ grep rbd2/ns1/test1
2026-03-24T12:01:21.876 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T12:01:21.876 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror image promote rbd2/ns1/test1
2026-03-24T12:01:22.007 INFO:tasks.workunit.client.0.vm05.stdout:Image promoted to primary
2026-03-24T12:01:22.012 INFO:tasks.workunit.client.0.vm05.stderr:++ seq 12
2026-03-24T12:01:22.013 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T12:01:22.013 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule status
2026-03-24T12:01:22.013 INFO:tasks.workunit.client.0.vm05.stderr:+ grep rbd2/ns1/test1
2026-03-24T12:01:22.035 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T12:01:32.036 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T12:01:32.036 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule status
2026-03-24T12:01:32.036 INFO:tasks.workunit.client.0.vm05.stderr:+ grep rbd2/ns1/test1
2026-03-24T12:01:32.061 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T12:01:42.062 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T12:01:42.062 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule status
2026-03-24T12:01:42.062 INFO:tasks.workunit.client.0.vm05.stderr:+ grep rbd2/ns1/test1
2026-03-24T12:01:42.090 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T12:01:52.091 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T12:01:52.091 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule status
2026-03-24T12:01:52.092 INFO:tasks.workunit.client.0.vm05.stderr:+ grep rbd2/ns1/test1
2026-03-24T12:01:52.119 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T12:02:02.120 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T12:02:02.120 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule status
2026-03-24T12:02:02.120 INFO:tasks.workunit.client.0.vm05.stderr:+ grep rbd2/ns1/test1
2026-03-24T12:02:02.144 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T12:02:12.145 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T12:02:12.146 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule status
2026-03-24T12:02:12.146 INFO:tasks.workunit.client.0.vm05.stderr:+ grep rbd2/ns1/test1
2026-03-24T12:02:12.175 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T12:02:22.176 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T12:02:22.176 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule status
2026-03-24T12:02:22.176 INFO:tasks.workunit.client.0.vm05.stderr:+ grep rbd2/ns1/test1
2026-03-24T12:02:22.405 INFO:tasks.workunit.client.0.vm05.stdout:2026-03-24 12:03:00 rbd2/ns1/test1
2026-03-24T12:02:22.405 INFO:tasks.workunit.client.0.vm05.stderr:+ break
2026-03-24T12:02:22.405 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule status
2026-03-24T12:02:22.405 INFO:tasks.workunit.client.0.vm05.stderr:+ grep rbd2/ns1/test1
2026-03-24T12:02:22.430 INFO:tasks.workunit.client.0.vm05.stdout:2026-03-24 12:03:00 rbd2/ns1/test1
2026-03-24T12:02:22.430 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule add 1h 00:15
2026-03-24T12:02:22.463 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror snapshot schedule ls
2026-03-24T12:02:22.488 INFO:tasks.workunit.client.0.vm05.stderr:+ test 'every 1h starting at 00:15:00' = 'every 1h starting at 00:15:00'
2026-03-24T12:02:22.488 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule ls -R
2026-03-24T12:02:22.488 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'every 1h starting at 00:15:00'
2026-03-24T12:02:22.513 INFO:tasks.workunit.client.0.vm05.stdout:- - - every 1h starting at 00:15:00
2026-03-24T12:02:22.514 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule ls -R
2026-03-24T12:02:22.514 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m'
2026-03-24T12:02:22.538 INFO:tasks.workunit.client.0.vm05.stdout:rbd2 ns1 test1 every 1m
2026-03-24T12:02:22.538 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd mirror snapshot schedule ls -p rbd2
2026-03-24T12:02:22.538 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule ls -p rbd2
2026-03-24T12:02:22.563 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T12:02:22.563 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule ls -p rbd2 -R
2026-03-24T12:02:22.563 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'every 1h starting at 00:15:00'
2026-03-24T12:02:22.590 INFO:tasks.workunit.client.0.vm05.stdout:- - - every 1h starting at 00:15:00
2026-03-24T12:02:22.590 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule ls -p rbd2 -R
2026-03-24T12:02:22.590 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m'
2026-03-24T12:02:22.616 INFO:tasks.workunit.client.0.vm05.stdout:rbd2 ns1 test1 every 1m
2026-03-24T12:02:22.616 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd mirror snapshot schedule ls -p rbd2/ns1
2026-03-24T12:02:22.616 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule ls -p rbd2/ns1
2026-03-24T12:02:22.642 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T12:02:22.642 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule ls -p rbd2/ns1 -R
2026-03-24T12:02:22.642 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'every 1h starting at 00:15:00'
2026-03-24T12:02:22.669 INFO:tasks.workunit.client.0.vm05.stdout:- - - every 1h starting at 00:15:00
2026-03-24T12:02:22.669 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule ls -p rbd2/ns1 -R
2026-03-24T12:02:22.669 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'rbd2 *ns1 *test1 *every 1m'
2026-03-24T12:02:22.695 INFO:tasks.workunit.client.0.vm05.stdout:rbd2 ns1 test1 every 1m
2026-03-24T12:02:22.696 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror snapshot schedule ls -p rbd2/ns1 --image test1
2026-03-24T12:02:22.728 INFO:tasks.workunit.client.0.vm05.stderr:+ test 'every 1m' = 'every 1m'
2026-03-24T12:02:22.728 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd mirror snapshot schedule add dummy
2026-03-24T12:02:22.728 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule add dummy
2026-03-24T12:02:22.749 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T12:02:22.752+0000 7f1d10701640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy)
2026-03-24T12:02:22.750 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd mirror snapshot schedule add failed: (22) Invalid argument: Invalid interval (dummy)
2026-03-24T12:02:22.752 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T12:02:22.752 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd mirror snapshot schedule add 1h dummy
2026-03-24T12:02:22.752 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule add 1h dummy
2026-03-24T12:02:22.775 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T12:02:22.776+0000 7f1d10701640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy
2026-03-24T12:02:22.775 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd mirror snapshot schedule add failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy
2026-03-24T12:02:22.777 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T12:02:22.777 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd mirror snapshot schedule add -p rbd2/ns1 --image test1 dummy
2026-03-24T12:02:22.777 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule add -p rbd2/ns1 --image test1 dummy
2026-03-24T12:02:22.805 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T12:02:22.808+0000 7f1d10701640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy)
2026-03-24T12:02:22.806 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd mirror snapshot schedule add failed: (22) Invalid argument: Invalid interval (dummy)
2026-03-24T12:02:22.808 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T12:02:22.809 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd mirror snapshot schedule add -p rbd2/ns1 --image test1 1h dummy
2026-03-24T12:02:22.809 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule add -p rbd2/ns1 --image test1 1h dummy
2026-03-24T12:02:22.838 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T12:02:22.840+0000 7f1d10701640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy
2026-03-24T12:02:22.839 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd mirror snapshot schedule add failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy
2026-03-24T12:02:22.842 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T12:02:22.842 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd mirror snapshot schedule remove dummy
2026-03-24T12:02:22.842 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule remove dummy
2026-03-24T12:02:22.863 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T12:02:22.864+0000 7f1d10701640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy)
2026-03-24T12:02:22.863 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd mirror snapshot schedule remove failed: (22) Invalid argument: Invalid interval (dummy)
2026-03-24T12:02:22.866 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T12:02:22.866 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd mirror snapshot schedule remove 1h dummy
2026-03-24T12:02:22.866 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule remove 1h dummy
2026-03-24T12:02:22.888 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T12:02:22.892+0000 7f1d10701640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy
2026-03-24T12:02:22.888 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd mirror snapshot schedule remove failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy
2026-03-24T12:02:22.891 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T12:02:22.891 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd mirror snapshot schedule remove -p rbd2/ns1 --image test1 dummy
2026-03-24T12:02:22.891 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule remove -p rbd2/ns1 --image test1 dummy
2026-03-24T12:02:22.919 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T12:02:22.921+0000 7f1d10701640 -1 mgr.server reply reply (22) Invalid argument Invalid interval (dummy)
2026-03-24T12:02:22.919 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd mirror snapshot schedule remove failed: (22) Invalid argument: Invalid interval (dummy)
2026-03-24T12:02:22.922 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T12:02:22.922 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd mirror snapshot schedule remove -p rbd2/ns1 --image test1 1h dummy
2026-03-24T12:02:22.922 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule remove -p rbd2/ns1 --image test1 1h dummy
2026-03-24T12:02:22.951 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T12:02:22.953+0000 7f1d10701640 -1 mgr.server reply reply (22) Invalid argument Invalid start time dummy: Unknown string format: dummy
2026-03-24T12:02:22.952 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd mirror snapshot schedule remove failed: (22) Invalid argument: Invalid start time dummy: Unknown string format: dummy
2026-03-24T12:02:22.954 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T12:02:22.955 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror snapshot schedule ls
2026-03-24T12:02:22.978 INFO:tasks.workunit.client.0.vm05.stderr:+ test 'every 1h starting at 00:15:00' = 'every 1h starting at 00:15:00'
2026-03-24T12:02:22.978 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror snapshot schedule ls -p rbd2/ns1 --image test1
2026-03-24T12:02:23.009 INFO:tasks.workunit.client.0.vm05.stderr:+ test 'every 1m' = 'every 1m'
2026-03-24T12:02:23.009 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm rbd2/ns1/test1
2026-03-24T12:02:25.476 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T12:02:25.481+0000 7fdced988640 0 -- 192.168.123.105:0/3678838862 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x557c8437b470 msgr2=0x557c8435ebc0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1)._try_send injecting socket failure
2026-03-24T12:02:26.484 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T12:02:26.489+0000 7fdced988640 0 -- 192.168.123.105:0/3678838862 >> [v2:192.168.123.105:6800/4104923970,v1:192.168.123.105:6801/4104923970] conn(0x7fdccc05d290 msgr2=0x7fdccc07d670 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure
2026-03-24T12:02:26.504 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done.
2026-03-24T12:02:26.508 INFO:tasks.workunit.client.0.vm05.stderr:++ seq 12
2026-03-24T12:02:26.508 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T12:02:26.509 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule status
2026-03-24T12:02:26.509 INFO:tasks.workunit.client.0.vm05.stderr:+ grep rbd2/ns1/test1
2026-03-24T12:02:26.533 INFO:tasks.workunit.client.0.vm05.stdout:2026-03-24 12:03:00 rbd2/ns1/test1
2026-03-24T12:02:26.533 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T12:02:36.534 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T12:02:36.534 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule status
2026-03-24T12:02:36.534 INFO:tasks.workunit.client.0.vm05.stderr:+ grep rbd2/ns1/test1
2026-03-24T12:02:36.559 INFO:tasks.workunit.client.0.vm05.stdout:2026-03-24 12:03:00 rbd2/ns1/test1
2026-03-24T12:02:36.559 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T12:02:46.560 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T12:02:46.560 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule status
2026-03-24T12:02:46.560 INFO:tasks.workunit.client.0.vm05.stderr:+ grep rbd2/ns1/test1
2026-03-24T12:02:46.585 INFO:tasks.workunit.client.0.vm05.stdout:2026-03-24 12:03:00 rbd2/ns1/test1
2026-03-24T12:02:46.585 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T12:02:56.586 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T12:02:56.586 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule status
2026-03-24T12:02:56.586 INFO:tasks.workunit.client.0.vm05.stderr:+ grep rbd2/ns1/test1
2026-03-24T12:02:56.611 INFO:tasks.workunit.client.0.vm05.stdout:2026-03-24 12:03:00 rbd2/ns1/test1
2026-03-24T12:02:56.612 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T12:03:06.613 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T12:03:06.613 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule status
2026-03-24T12:03:06.613 INFO:tasks.workunit.client.0.vm05.stderr:+ grep rbd2/ns1/test1
2026-03-24T12:03:06.635 INFO:tasks.workunit.client.0.vm05.stdout:2026-03-24 12:04:00 rbd2/ns1/test1
2026-03-24T12:03:06.636 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T12:03:16.637 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 12`
2026-03-24T12:03:16.637 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule status
2026-03-24T12:03:16.637 INFO:tasks.workunit.client.0.vm05.stderr:+ grep rbd2/ns1/test1
2026-03-24T12:03:16.663 INFO:tasks.workunit.client.0.vm05.stderr:+ break
2026-03-24T12:03:16.663 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule status
2026-03-24T12:03:16.663 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail grep rbd2/ns1/test1
2026-03-24T12:03:16.663 INFO:tasks.workunit.client.0.vm05.stderr:+ grep rbd2/ns1/test1
2026-03-24T12:03:16.686 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T12:03:16.686 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule remove
2026-03-24T12:03:16.716 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror snapshot schedule ls -R --format json
2026-03-24T12:03:16.740 INFO:tasks.workunit.client.0.vm05.stderr:+ test '[]' = '[]'
2026-03-24T12:03:16.740 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images
2026-03-24T12:03:16.740 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:16.819 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:16.894 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:16.971 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:17.046 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:17.121 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:17.200 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:17.277 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:17.355 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:17.432 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:17.911 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:17.988 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:18.135 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:18.232 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:18.311 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:18.390 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:18.468 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:18.550 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:18.626 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd pool rm rbd2 rbd2 --yes-i-really-really-mean-it
2026-03-24T12:03:19.771 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd2' does not exist
2026-03-24T12:03:19.784 INFO:tasks.workunit.client.0.vm05.stdout:testing recovery of mirror snapshot scheduler after module's RADOS client is blocklisted...
2026-03-24T12:03:19.785 INFO:tasks.workunit.client.0.vm05.stderr:+ test_mirror_snapshot_schedule_recovery
2026-03-24T12:03:19.785 INFO:tasks.workunit.client.0.vm05.stderr:+ echo 'testing recovery of mirror snapshot scheduler after module'\''s RADOS client is blocklisted...'
2026-03-24T12:03:19.785 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images
2026-03-24T12:03:19.785 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:19.864 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:19.941 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:20.018 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:20.094 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:20.170 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:20.253 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:20.334 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:20.410 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:20.486 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:20.568 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:20.647 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:20.726 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:20.806 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:20.888 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:20.968 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:21.047 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:21.126 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS
2026-03-24T12:03:21.205 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd pool create rbd3 8
2026-03-24T12:03:21.787 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd3' already exists
2026-03-24T12:03:21.800 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd pool init rbd3
2026-03-24T12:03:24.749 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd namespace create rbd3/ns1
2026-03-24T12:03:24.778 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror pool enable rbd3 image
2026-03-24T12:03:24.805 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror pool enable rbd3/ns1 image
2026-03-24T12:03:24.831 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror pool peer add rbd3 cluster1
2026-03-24T12:03:24.855 INFO:tasks.workunit.client.0.vm05.stdout:510e112c-55c7-42b3-9ffc-f192e7bddd8b
2026-03-24T12:03:24.858 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 -s 1 rbd3/ns1/test1
2026-03-24T12:03:24.888 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror image enable rbd3/ns1/test1 snapshot
2026-03-24T12:03:25.770 INFO:tasks.workunit.client.0.vm05.stdout:Mirroring enabled
2026-03-24T12:03:25.777 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror image status rbd3/ns1/test1
2026-03-24T12:03:25.777 INFO:tasks.workunit.client.0.vm05.stderr:++ grep -c mirror.primary
2026-03-24T12:03:25.811 INFO:tasks.workunit.client.0.vm05.stderr:+ test 1 = 1
2026-03-24T12:03:25.811 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 1m
2026-03-24T12:03:25.845 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror snapshot schedule ls -p rbd3/ns1 --image test1
2026-03-24T12:03:25.879 INFO:tasks.workunit.client.0.vm05.stderr:+ test 'every 1m' = 'every 1m'
2026-03-24T12:03:25.879 INFO:tasks.workunit.client.0.vm05.stderr:++ ceph mgr dump
2026-03-24T12:03:25.880 INFO:tasks.workunit.client.0.vm05.stderr:++ jq 'select(.name == "rbd_support")'
2026-03-24T12:03:25.880 INFO:tasks.workunit.client.0.vm05.stderr:++ jq '.active_clients[]'
2026-03-24T12:03:25.880 INFO:tasks.workunit.client.0.vm05.stderr:++ jq -r '[.addrvec[0].addr, "/", .addrvec[0].nonce|tostring] | add'
2026-03-24T12:03:26.153 INFO:tasks.workunit.client.0.vm05.stderr:+ CLIENT_ADDR=192.168.123.105:0/157535646
2026-03-24T12:03:26.153 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd blocklist add 192.168.123.105:0/157535646
2026-03-24T12:03:27.764 INFO:tasks.workunit.client.0.vm05.stderr:blocklisting 192.168.123.105:0/157535646 until 2026-03-24T13:03:26.828252+0000 (3600 sec)
2026-03-24T12:03:27.779 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 2m
2026-03-24T12:03:27.779 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 2m
2026-03-24T12:03:27.804 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T12:03:27.809+0000 7f1d10701640 -1 librbd::api::Namespace: list: error listing namespaces: (108) Cannot send after transport endpoint shutdown
2026-03-24T12:03:27.805 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T12:03:27.809+0000 7f1d10701640 -1 mgr.server reply reply (11) Resource temporarily unavailable [errno 108] RBD connection was shutdown (error listing namespaces)
2026-03-24T12:03:27.805 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd mirror snapshot schedule add failed: (11) Resource temporarily unavailable: [errno 108] RBD connection was shutdown (error listing namespaces)
2026-03-24T12:03:27.808 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T12:03:27.808 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T12:03:37.810 INFO:tasks.workunit.client.0.vm05.stderr:++ seq 24
2026-03-24T12:03:37.811 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 24`
2026-03-24T12:03:37.811 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 2m
2026-03-24T12:03:37.835 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T12:03:37.841+0000 7f1d10701640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again
2026-03-24T12:03:37.835 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd mirror snapshot schedule add failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again
2026-03-24T12:03:37.838 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T12:03:47.840 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 24` 2026-03-24T12:03:47.840 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 2m 2026-03-24T12:03:47.865 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T12:03:47.869+0000 7f1d10701640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again 2026-03-24T12:03:47.865 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd mirror snapshot schedule add failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again 2026-03-24T12:03:47.868 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10 2026-03-24T12:03:57.870 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 24` 2026-03-24T12:03:57.870 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 2m 2026-03-24T12:03:57.893 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T12:03:57.897+0000 7f1d10701640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again 2026-03-24T12:03:57.894 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd mirror snapshot schedule add failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again 2026-03-24T12:03:57.896 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10 2026-03-24T12:04:07.898 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 24` 2026-03-24T12:04:07.898 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 2m 2026-03-24T12:04:07.928 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T12:04:07.933+0000 7f1d10701640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again 2026-03-24T12:04:07.928 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd mirror snapshot schedule add failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again 
2026-03-24T12:04:07.932 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10 2026-03-24T12:04:17.933 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 24` 2026-03-24T12:04:17.933 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 2m 2026-03-24T12:04:17.958 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T12:04:17.961+0000 7f1d10701640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again 2026-03-24T12:04:17.958 INFO:tasks.workunit.client.0.vm05.stderr:rbd: rbd mirror snapshot schedule add failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again 2026-03-24T12:04:17.961 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10 2026-03-24T12:04:27.963 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 24` 2026-03-24T12:04:27.963 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule add -p rbd3/ns1 --image test1 2m 2026-03-24T12:04:28.003 INFO:tasks.workunit.client.0.vm05.stderr:+ break 2026-03-24T12:04:28.003 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule ls -p rbd3/ns1 --image test1 2026-03-24T12:04:28.003 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'every 2m' 2026-03-24T12:04:28.040 INFO:tasks.workunit.client.0.vm05.stdout:every 1m, every 2m 2026-03-24T12:04:28.040 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule ls -p rbd3/ns1 --image test1 2026-03-24T12:04:28.040 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'every 1m' 2026-03-24T12:04:28.076 INFO:tasks.workunit.client.0.vm05.stdout:every 1m, every 2m 2026-03-24T12:04:28.077 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule rm -p rbd3/ns1 --image test1 2m 2026-03-24T12:04:28.129 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule rm -p rbd3/ns1 --image test1 1m 2026-03-24T12:04:28.164 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule ls -p 
rbd3/ns1 --image test1 2026-03-24T12:04:28.165 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail grep 'every 2m' 2026-03-24T12:04:28.165 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'every 2m' 2026-03-24T12:04:28.202 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0 2026-03-24T12:04:28.202 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror snapshot schedule ls -p rbd3/ns1 --image test1 2026-03-24T12:04:28.202 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail grep 'every 1m' 2026-03-24T12:04:28.203 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'every 1m' 2026-03-24T12:04:28.237 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0 2026-03-24T12:04:28.237 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap purge rbd3/ns1/test1 2026-03-24T12:04:28.293 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd rm rbd3/ns1/test1 2026-03-24T12:04:29.061 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T12:04:29.065+0000 7f7d37f2f640 0 -- 192.168.123.105:0/2180649458 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x5610606a0730 msgr2=0x5610606d34d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T12:04:29.067 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T12:04:29.073+0000 7f7d37f2f640 0 -- 192.168.123.105:0/2180649458 >> [v2:192.168.123.105:6816/951022638,v1:192.168.123.105:6817/951022638] conn(0x7f7d1805cd80 msgr2=0x7f7d1807d160 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T12:04:29.073 INFO:tasks.workunit.client.0.vm05.stderr: Removing image: 100% complete...done. 2026-03-24T12:04:29.077 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd pool rm rbd3 rbd3 --yes-i-really-really-mean-it 2026-03-24T12:04:30.100 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd3' does not exist 2026-03-24T12:04:30.114 INFO:tasks.workunit.client.0.vm05.stdout:testing perf image iostat... 
2026-03-24T12:04:30.114 INFO:tasks.workunit.client.0.vm05.stderr:+ test_perf_image_iostat 2026-03-24T12:04:30.114 INFO:tasks.workunit.client.0.vm05.stderr:+ echo 'testing perf image iostat...' 2026-03-24T12:04:30.114 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images 2026-03-24T12:04:30.114 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:04:30.236 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:04:30.312 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:04:30.630 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:04:30.739 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:04:30.860 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:04:30.949 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:04:31.038 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:04:31.124 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:04:31.207 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:04:31.292 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:04:31.380 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:04:31.469 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:04:31.559 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:04:31.648 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:04:31.737 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:04:31.832 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:04:31.925 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:04:32.362 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd pool create rbd1 8 2026-03-24T12:04:33.117 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd1' already exists 2026-03-24T12:04:33.131 
INFO:tasks.workunit.client.0.vm05.stderr:+ rbd pool init rbd1 2026-03-24T12:04:36.148 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd namespace create rbd1/ns 2026-03-24T12:04:36.180 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd pool create rbd2 8 2026-03-24T12:04:37.233 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd2' already exists 2026-03-24T12:04:37.247 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd pool init rbd2 2026-03-24T12:04:39.960 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd namespace create rbd2/ns 2026-03-24T12:04:39.989 INFO:tasks.workunit.client.0.vm05.stderr:+ IMAGE_SPECS=("test1" "rbd1/test2" "rbd1/ns/test3" "rbd2/test4" "rbd2/ns/test5") 2026-03-24T12:04:39.990 INFO:tasks.workunit.client.0.vm05.stderr:+ for spec in "${IMAGE_SPECS[@]}" 2026-03-24T12:04:39.990 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 10G --rbd-default-data-pool '' test1 2026-03-24T12:04:40.069 INFO:tasks.workunit.client.0.vm05.stderr:+ for spec in "${IMAGE_SPECS[@]}" 2026-03-24T12:04:40.070 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 10G --rbd-default-data-pool '' rbd1/test2 2026-03-24T12:04:40.102 INFO:tasks.workunit.client.0.vm05.stderr:+ for spec in "${IMAGE_SPECS[@]}" 2026-03-24T12:04:40.102 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 10G --rbd-default-data-pool '' rbd1/ns/test3 2026-03-24T12:04:40.138 INFO:tasks.workunit.client.0.vm05.stderr:+ for spec in "${IMAGE_SPECS[@]}" 2026-03-24T12:04:40.139 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 10G --rbd-default-data-pool '' rbd2/test4 2026-03-24T12:04:40.176 INFO:tasks.workunit.client.0.vm05.stderr:+ for spec in "${IMAGE_SPECS[@]}" 2026-03-24T12:04:40.176 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 10G --rbd-default-data-pool '' rbd2/ns/test5 2026-03-24T12:04:40.446 INFO:tasks.workunit.client.0.vm05.stderr:+ BENCH_PIDS=() 2026-03-24T12:04:40.447 
INFO:tasks.workunit.client.0.vm05.stderr:+ for spec in "${IMAGE_SPECS[@]}" 2026-03-24T12:04:40.447 INFO:tasks.workunit.client.0.vm05.stderr:+ BENCH_PIDS+=($!) 2026-03-24T12:04:40.447 INFO:tasks.workunit.client.0.vm05.stderr:+ for spec in "${IMAGE_SPECS[@]}" 2026-03-24T12:04:40.447 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 10G --io-threads 1 --rbd-cache false test1 2026-03-24T12:04:40.447 INFO:tasks.workunit.client.0.vm05.stderr:+ BENCH_PIDS+=($!) 2026-03-24T12:04:40.447 INFO:tasks.workunit.client.0.vm05.stderr:+ for spec in "${IMAGE_SPECS[@]}" 2026-03-24T12:04:40.447 INFO:tasks.workunit.client.0.vm05.stderr:+ BENCH_PIDS+=($!) 2026-03-24T12:04:40.447 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 10G --io-threads 1 --rbd-cache false rbd1/test2 2026-03-24T12:04:40.447 INFO:tasks.workunit.client.0.vm05.stderr:+ for spec in "${IMAGE_SPECS[@]}" 2026-03-24T12:04:40.447 INFO:tasks.workunit.client.0.vm05.stderr:+ BENCH_PIDS+=($!) 2026-03-24T12:04:40.447 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 10G --io-threads 1 --rbd-cache false rbd1/ns/test3 2026-03-24T12:04:40.447 INFO:tasks.workunit.client.0.vm05.stderr:+ for spec in "${IMAGE_SPECS[@]}" 2026-03-24T12:04:40.447 INFO:tasks.workunit.client.0.vm05.stderr:+ BENCH_PIDS+=($!) 
2026-03-24T12:04:40.447 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 10G --io-threads 1 --rbd-cache false rbd2/ns/test5 2026-03-24T12:04:40.448 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd perf image iostat --format json rbd1 2026-03-24T12:04:40.451 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 10G --io-threads 1 --rbd-cache false rbd2/test4 2026-03-24T12:04:40.455 INFO:tasks.workunit.client.0.vm05.stderr:++ jq -r 'map(.image) | sort | join(" ")' 2026-03-24T12:04:40.521 INFO:tasks.workunit.client.0.vm05.stderr:rbd: waiting for initial image stats 2026-03-24T12:04:40.521 INFO:tasks.workunit.client.0.vm05.stderr: 2026-03-24T12:04:50.528 INFO:tasks.workunit.client.0.vm05.stderr:+ test test2 = test2 2026-03-24T12:04:50.528 INFO:tasks.workunit.client.0.vm05.stderr:++ jq -r 'map(.image) | sort | join(" ")' 2026-03-24T12:04:50.529 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd perf image iostat --format json rbd1/ns 2026-03-24T12:04:50.577 INFO:tasks.workunit.client.0.vm05.stderr:rbd: waiting for initial image stats 2026-03-24T12:04:50.577 INFO:tasks.workunit.client.0.vm05.stderr: 2026-03-24T12:05:00.587 INFO:tasks.workunit.client.0.vm05.stderr:+ test test3 = test3 2026-03-24T12:05:00.592 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd perf image iostat --format json --rbd-default-pool rbd1 /ns 2026-03-24T12:05:00.592 INFO:tasks.workunit.client.0.vm05.stderr:++ jq -r 'map(.image) | sort | join(" ")' 2026-03-24T12:05:00.661 INFO:tasks.workunit.client.0.vm05.stderr:+ test test3 = test3 2026-03-24T12:05:00.661 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd perf image iostat --format json --pool rbd2 2026-03-24T12:05:00.662 INFO:tasks.workunit.client.0.vm05.stderr:++ jq -r 'map(.image) | sort | join(" ")' 2026-03-24T12:05:00.739 INFO:tasks.workunit.client.0.vm05.stderr:rbd: waiting for initial image stats 2026-03-24T12:05:00.739 
INFO:tasks.workunit.client.0.vm05.stderr: 2026-03-24T12:05:10.749 INFO:tasks.workunit.client.0.vm05.stderr:+ test test4 = test4 2026-03-24T12:05:10.749 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd perf image iostat --format json --pool rbd2 --namespace ns 2026-03-24T12:05:10.749 INFO:tasks.workunit.client.0.vm05.stderr:++ jq -r 'map(.image) | sort | join(" ")' 2026-03-24T12:05:10.851 INFO:tasks.workunit.client.0.vm05.stderr:rbd: waiting for initial image stats 2026-03-24T12:05:10.852 INFO:tasks.workunit.client.0.vm05.stderr: 2026-03-24T12:05:20.864 INFO:tasks.workunit.client.0.vm05.stderr:+ test test5 = test5 2026-03-24T12:05:20.864 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd perf image iostat --format json --rbd-default-pool rbd2 --namespace ns 2026-03-24T12:05:20.865 INFO:tasks.workunit.client.0.vm05.stderr:++ jq -r 'map(.image) | sort | join(" ")' 2026-03-24T12:05:20.930 INFO:tasks.workunit.client.0.vm05.stderr:+ test test5 = test5 2026-03-24T12:05:20.931 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd perf image iostat --format json 2026-03-24T12:05:20.933 INFO:tasks.workunit.client.0.vm05.stderr:++ jq -r 'map(.image) | sort | join(" ")' 2026-03-24T12:05:21.058 INFO:tasks.workunit.client.0.vm05.stderr:rbd: waiting for initial image stats 2026-03-24T12:05:21.058 INFO:tasks.workunit.client.0.vm05.stderr: 2026-03-24T12:05:36.071 INFO:tasks.workunit.client.0.vm05.stderr:+ test 'test1 test2 test3 test4 test5' = 'test1 test2 test3 test4 test5' 2026-03-24T12:05:36.071 INFO:tasks.workunit.client.0.vm05.stderr:+ for pid in "${BENCH_PIDS[@]}" 2026-03-24T12:05:36.071 INFO:tasks.workunit.client.0.vm05.stderr:+ kill 83370 2026-03-24T12:05:36.071 INFO:tasks.workunit.client.0.vm05.stderr:+ for pid in "${BENCH_PIDS[@]}" 2026-03-24T12:05:36.071 INFO:tasks.workunit.client.0.vm05.stderr:+ kill 83371 2026-03-24T12:05:36.071 INFO:tasks.workunit.client.0.vm05.stderr:+ for pid in "${BENCH_PIDS[@]}" 2026-03-24T12:05:36.071 INFO:tasks.workunit.client.0.vm05.stderr:+ kill 83372 
2026-03-24T12:05:36.071 INFO:tasks.workunit.client.0.vm05.stderr:+ for pid in "${BENCH_PIDS[@]}" 2026-03-24T12:05:36.071 INFO:tasks.workunit.client.0.vm05.stderr:+ kill 83373 2026-03-24T12:05:36.071 INFO:tasks.workunit.client.0.vm05.stderr:+ for pid in "${BENCH_PIDS[@]}" 2026-03-24T12:05:36.071 INFO:tasks.workunit.client.0.vm05.stderr:+ kill 83374 2026-03-24T12:05:36.071 INFO:tasks.workunit.client.0.vm05.stderr:+ wait 2026-03-24T12:05:36.121 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images 2026-03-24T12:05:36.121 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:36.209 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:36.290 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:36.374 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:36.508 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:36.980 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:37.068 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:37.151 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:37.234 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:37.316 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:37.398 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:37.487 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:37.574 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:37.658 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:40.975 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:41.060 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:41.143 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:41.228 INFO:tasks.workunit.client.0.vm05.stderr:+ 
for img in $IMGS 2026-03-24T12:05:41.318 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd pool rm rbd2 rbd2 --yes-i-really-really-mean-it 2026-03-24T12:05:42.227 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd2' does not exist 2026-03-24T12:05:42.254 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd pool rm rbd1 rbd1 --yes-i-really-really-mean-it 2026-03-24T12:05:43.117 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd1' does not exist 2026-03-24T12:05:43.135 INFO:tasks.workunit.client.0.vm05.stdout:testing recovery of perf handler after module's RADOS client is blocklisted... 2026-03-24T12:05:43.135 INFO:tasks.workunit.client.0.vm05.stderr:+ test_perf_image_iostat_recovery 2026-03-24T12:05:43.135 INFO:tasks.workunit.client.0.vm05.stderr:+ echo 'testing recovery of perf handler after module'\''s RADOS client is blocklisted...' 2026-03-24T12:05:43.135 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images 2026-03-24T12:05:43.135 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:43.252 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:43.461 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:43.543 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:43.625 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:43.707 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:43.793 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:43.877 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:43.963 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:44.044 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:44.128 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:44.213 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:44.299 
INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:44.585 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:44.667 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:44.749 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:44.828 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:44.913 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:05:44.997 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd pool create rbd3 8 2026-03-24T12:05:46.098 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd3' already exists 2026-03-24T12:05:46.111 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd pool init rbd3 2026-03-24T12:05:46.329 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T12:05:46.334+0000 7fa39f8eb640 0 --2- 192.168.123.105:0/4167522316 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x5562fc4ee0a0 0x5562fc500570 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 2026-03-24T12:05:49.064 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd namespace create rbd3/ns 2026-03-24T12:05:49.092 INFO:tasks.workunit.client.0.vm05.stderr:+ IMAGE_SPECS=("rbd3/test1" "rbd3/ns/test2") 2026-03-24T12:05:49.092 INFO:tasks.workunit.client.0.vm05.stderr:+ for spec in "${IMAGE_SPECS[@]}" 2026-03-24T12:05:49.092 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 10G --rbd-default-data-pool '' rbd3/test1 2026-03-24T12:05:49.124 INFO:tasks.workunit.client.0.vm05.stderr:+ for spec in "${IMAGE_SPECS[@]}" 2026-03-24T12:05:49.124 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 10G --rbd-default-data-pool '' rbd3/ns/test2 2026-03-24T12:05:49.157 INFO:tasks.workunit.client.0.vm05.stderr:+ BENCH_PIDS=() 2026-03-24T12:05:49.157 INFO:tasks.workunit.client.0.vm05.stderr:+ for spec in 
"${IMAGE_SPECS[@]}" 2026-03-24T12:05:49.157 INFO:tasks.workunit.client.0.vm05.stderr:+ BENCH_PIDS+=($!) 2026-03-24T12:05:49.157 INFO:tasks.workunit.client.0.vm05.stderr:+ for spec in "${IMAGE_SPECS[@]}" 2026-03-24T12:05:49.157 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 10G --io-threads 1 --rbd-cache false rbd3/test1 2026-03-24T12:05:49.157 INFO:tasks.workunit.client.0.vm05.stderr:+ BENCH_PIDS+=($!) 2026-03-24T12:05:49.157 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd bench --io-type write --io-pattern rand --io-total 10G --io-threads 1 --rbd-cache false rbd3/ns/test2 2026-03-24T12:05:49.157 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd perf image iostat --format json rbd3 2026-03-24T12:05:49.157 INFO:tasks.workunit.client.0.vm05.stderr:++ jq -r 'map(.image) | sort | join(" ")' 2026-03-24T12:05:49.181 INFO:tasks.workunit.client.0.vm05.stderr:rbd: waiting for initial image stats 2026-03-24T12:05:49.181 INFO:tasks.workunit.client.0.vm05.stderr: 2026-03-24T12:05:59.190 INFO:tasks.workunit.client.0.vm05.stderr:+ test test1 = test1 2026-03-24T12:05:59.192 INFO:tasks.workunit.client.0.vm05.stderr:++ ceph mgr dump 2026-03-24T12:05:59.192 INFO:tasks.workunit.client.0.vm05.stderr:++ jq '.active_clients[]' 2026-03-24T12:05:59.194 INFO:tasks.workunit.client.0.vm05.stderr:++ jq -r '[.addrvec[0].addr, "/", .addrvec[0].nonce|tostring] | add' 2026-03-24T12:05:59.204 INFO:tasks.workunit.client.0.vm05.stderr:++ jq 'select(.name == "rbd_support")' 2026-03-24T12:05:59.583 INFO:tasks.workunit.client.0.vm05.stderr:+ CLIENT_ADDR=192.168.123.105:0/1389144683 2026-03-24T12:05:59.583 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd blocklist add 192.168.123.105:0/1389144683 2026-03-24T12:06:01.074 INFO:tasks.workunit.client.0.vm05.stderr:blocklisting 192.168.123.105:0/1389144683 until 2026-03-24T13:06:00.150543+0000 (3600 sec) 2026-03-24T12:06:01.092 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd perf image iostat 
--format json rbd3/ns 2026-03-24T12:06:01.092 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd perf image iostat --format json rbd3/ns 2026-03-24T12:06:01.157 INFO:tasks.workunit.client.0.vm05.stderr:rbd: waiting for initial image stats 2026-03-24T12:06:01.157 INFO:tasks.workunit.client.0.vm05.stderr: 2026-03-24T12:06:06.156 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T12:06:06.162+0000 7f1d0bef8640 -1 librbd::api::Image: list_images_v2: error listing image in directory: (108) Cannot send after transport endpoint shutdown 2026-03-24T12:06:06.156 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T12:06:06.162+0000 7f1d0bef8640 -1 librbd::api::Image: list_images: error listing v2 images: (108) Cannot send after transport endpoint shutdown 2026-03-24T12:06:06.158 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T12:06:06.162+0000 7f1d10701640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again 2026-03-24T12:06:06.158 INFO:tasks.workunit.client.0.vm05.stderr:rbd: mgr command failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again 2026-03-24T12:06:06.161 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0 2026-03-24T12:06:06.161 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10 2026-03-24T12:06:16.164 INFO:tasks.workunit.client.0.vm05.stderr:++ seq 24 2026-03-24T12:06:16.165 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 24` 2026-03-24T12:06:16.166 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd perf image iostat --format json rbd3/ns 2026-03-24T12:06:16.169 INFO:tasks.workunit.client.0.vm05.stderr:++ jq -r 'map(.image) | sort | join(" ")' 2026-03-24T12:06:16.213 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T12:06:16.218+0000 7f1d10701640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again 2026-03-24T12:06:16.213 INFO:tasks.workunit.client.0.vm05.stderr:rbd: mgr command failed: (11) Resource temporarily unavailable: rbd_support 
module is not ready, try again 2026-03-24T12:06:16.222 INFO:tasks.workunit.client.0.vm05.stderr:+ test '' = test2 2026-03-24T12:06:16.222 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10 2026-03-24T12:06:26.225 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 24` 2026-03-24T12:06:26.226 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd perf image iostat --format json rbd3/ns 2026-03-24T12:06:26.226 INFO:tasks.workunit.client.0.vm05.stderr:++ jq -r 'map(.image) | sort | join(" ")' 2026-03-24T12:06:26.250 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T12:06:26.254+0000 7f1d10701640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again 2026-03-24T12:06:26.250 INFO:tasks.workunit.client.0.vm05.stderr:rbd: mgr command failed: (11) Resource temporarily unavailable: rbd_support module is not ready, try again 2026-03-24T12:06:26.253 INFO:tasks.workunit.client.0.vm05.stderr:+ test '' = test2 2026-03-24T12:06:26.253 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10 2026-03-24T12:06:36.255 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 24` 2026-03-24T12:06:36.257 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd perf image iostat --format json rbd3/ns 2026-03-24T12:06:36.258 INFO:tasks.workunit.client.0.vm05.stderr:++ jq -r 'map(.image) | sort | join(" ")' 2026-03-24T12:06:36.322 INFO:tasks.workunit.client.0.vm05.stderr:rbd: waiting for initial image stats 2026-03-24T12:06:36.322 INFO:tasks.workunit.client.0.vm05.stderr: 2026-03-24T12:06:46.333 INFO:tasks.workunit.client.0.vm05.stderr:+ test test2 = test2 2026-03-24T12:06:46.333 INFO:tasks.workunit.client.0.vm05.stderr:+ break 2026-03-24T12:06:46.333 INFO:tasks.workunit.client.0.vm05.stderr:+ for pid in "${BENCH_PIDS[@]}" 2026-03-24T12:06:46.333 INFO:tasks.workunit.client.0.vm05.stderr:+ kill 85303 2026-03-24T12:06:46.333 INFO:tasks.workunit.client.0.vm05.stderr:+ for pid in "${BENCH_PIDS[@]}" 2026-03-24T12:06:46.333 
INFO:tasks.workunit.client.0.vm05.stderr:+ kill 85304 2026-03-24T12:06:46.333 INFO:tasks.workunit.client.0.vm05.stderr:+ wait 2026-03-24T12:06:46.358 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images 2026-03-24T12:06:46.358 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:46.449 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:46.532 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:46.619 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:46.706 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:46.789 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:46.873 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:46.958 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:47.042 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:47.129 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:47.219 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:47.310 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:47.393 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:47.477 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:47.560 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:47.642 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:47.725 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:47.809 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:47.892 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd pool rm rbd3 rbd3 --yes-i-really-really-mean-it 2026-03-24T12:06:48.866 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd3' does not exist 2026-03-24T12:06:48.884 
INFO:tasks.workunit.client.0.vm05.stderr:+ test_mirror_pool_peer_bootstrap_create 2026-03-24T12:06:48.884 INFO:tasks.workunit.client.0.vm05.stdout:testing mirror pool peer bootstrap create... 2026-03-24T12:06:48.884 INFO:tasks.workunit.client.0.vm05.stderr:+ echo 'testing mirror pool peer bootstrap create...' 2026-03-24T12:06:48.884 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images 2026-03-24T12:06:48.884 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:49.141 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:49.229 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:49.320 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:49.408 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:49.495 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:49.584 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:49.667 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:49.752 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:49.837 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:49.924 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:50.029 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:50.119 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:50.206 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:50.282 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:50.358 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:50.438 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:50.533 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:06:50.624 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd pool create rbd1 8 
2026-03-24T12:06:51.805 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd1' already exists 2026-03-24T12:06:51.818 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd pool init rbd1 2026-03-24T12:06:54.773 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror pool enable rbd1 image 2026-03-24T12:06:54.802 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd pool create rbd2 8 2026-03-24T12:06:55.831 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd2' already exists 2026-03-24T12:06:55.845 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd pool init rbd2 2026-03-24T12:06:58.793 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd mirror pool enable rbd2 pool 2026-03-24T12:06:58.822 INFO:tasks.workunit.client.0.vm05.stderr:+ readarray -t MON_ADDRS 2026-03-24T12:06:58.822 INFO:tasks.workunit.client.0.vm05.stderr:++ ceph mon dump 2026-03-24T12:06:58.822 INFO:tasks.workunit.client.0.vm05.stderr:++ sed -n 's/^[0-9]: \(.*\) mon\.[a-z]$/\1/p' 2026-03-24T12:06:59.105 INFO:tasks.workunit.client.0.vm05.stderr:dumped monmap epoch 1 2026-03-24T12:06:59.118 INFO:tasks.workunit.client.0.vm05.stderr:+ BAD_MON_ADDR=1.2.3.4:6789 2026-03-24T12:06:59.118 INFO:tasks.workunit.client.0.vm05.stderr:+ MON_HOST='[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0],1.2.3.4:6789' 2026-03-24T12:06:59.119 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror pool peer bootstrap create --mon-host '[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0],1.2.3.4:6789' rbd1 2026-03-24T12:06:59.119 INFO:tasks.workunit.client.0.vm05.stderr:++ base64 -d 2026-03-24T12:06:59.150 INFO:tasks.workunit.client.0.vm05.stderr:+ TOKEN='{"fsid":"a0c8ea99-5654-4097-bf98-f2a2e799bc82","client_id":"rbd-mirror-peer","key":"AQBjfsJpSSMxCRAAn8gggNFcp3BgkLlEdacI8A==","mon_host":"[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0]"}' 2026-03-24T12:06:59.150 INFO:tasks.workunit.client.0.vm05.stderr:++ jq -r .fsid 2026-03-24T12:06:59.161 INFO:tasks.workunit.client.0.vm05.stderr:+ TOKEN_FSID=a0c8ea99-5654-4097-bf98-f2a2e799bc82 
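The monmap-parsing step logged above (`ceph mon dump` piped through sed to build `MON_ADDRS`) can be reproduced standalone. A minimal sketch against a canned monmap line of the same shape as this run's epoch-1 dump, since no live cluster is available here:

```shell
# Canned line of the form `ceph mon dump` prints for each monitor; the
# live run pipes the real dump through this exact sed expression.
MON_LINE='0: [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] mon.a'

# Strip the rank prefix and the trailing `mon.<id>`, leaving the addrvec.
ADDR=$(echo "$MON_LINE" | sed -n 's/^[0-9]: \(.*\) mon\.[a-z]$/\1/p')
echo "$ADDR"   # -> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0]
```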
2026-03-24T12:06:59.161 INFO:tasks.workunit.client.0.vm05.stderr:++ jq -r .client_id 2026-03-24T12:06:59.170 INFO:tasks.workunit.client.0.vm05.stderr:+ TOKEN_CLIENT_ID=rbd-mirror-peer 2026-03-24T12:06:59.170 INFO:tasks.workunit.client.0.vm05.stderr:++ jq -r .key 2026-03-24T12:06:59.179 INFO:tasks.workunit.client.0.vm05.stderr:+ TOKEN_KEY=AQBjfsJpSSMxCRAAn8gggNFcp3BgkLlEdacI8A== 2026-03-24T12:06:59.179 INFO:tasks.workunit.client.0.vm05.stderr:++ jq -r .mon_host 2026-03-24T12:06:59.188 INFO:tasks.workunit.client.0.vm05.stderr:+ TOKEN_MON_HOST='[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0]' 2026-03-24T12:06:59.189 INFO:tasks.workunit.client.0.vm05.stderr:++ ceph fsid 2026-03-24T12:06:59.471 INFO:tasks.workunit.client.0.vm05.stderr:+ test a0c8ea99-5654-4097-bf98-f2a2e799bc82 = a0c8ea99-5654-4097-bf98-f2a2e799bc82 2026-03-24T12:06:59.471 INFO:tasks.workunit.client.0.vm05.stderr:++ ceph auth get-key client.rbd-mirror-peer 2026-03-24T12:06:59.749 INFO:tasks.workunit.client.0.vm05.stderr:+ test AQBjfsJpSSMxCRAAn8gggNFcp3BgkLlEdacI8A== = AQBjfsJpSSMxCRAAn8gggNFcp3BgkLlEdacI8A== 2026-03-24T12:06:59.749 INFO:tasks.workunit.client.0.vm05.stderr:+ for addr in "${MON_ADDRS[@]}" 2026-03-24T12:06:59.749 INFO:tasks.workunit.client.0.vm05.stderr:+ fgrep '[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0]' 2026-03-24T12:06:59.751 INFO:tasks.workunit.client.0.vm05.stdout:[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 2026-03-24T12:06:59.751 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail fgrep 1.2.3.4:6789 2026-03-24T12:06:59.751 INFO:tasks.workunit.client.0.vm05.stderr:+ fgrep 1.2.3.4:6789 2026-03-24T12:06:59.752 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0 2026-03-24T12:06:59.753 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror pool peer bootstrap create --mon-host '[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0],1.2.3.4:6789' rbd1 2026-03-24T12:06:59.753 INFO:tasks.workunit.client.0.vm05.stderr:++ base64 -d 2026-03-24T12:06:59.782 
INFO:tasks.workunit.client.0.vm05.stderr:+ test '{"fsid":"a0c8ea99-5654-4097-bf98-f2a2e799bc82","client_id":"rbd-mirror-peer","key":"AQBjfsJpSSMxCRAAn8gggNFcp3BgkLlEdacI8A==","mon_host":"[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0]"}' = '{"fsid":"a0c8ea99-5654-4097-bf98-f2a2e799bc82","client_id":"rbd-mirror-peer","key":"AQBjfsJpSSMxCRAAn8gggNFcp3BgkLlEdacI8A==","mon_host":"[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0]"}' 2026-03-24T12:06:59.782 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror pool peer bootstrap create rbd1 2026-03-24T12:06:59.782 INFO:tasks.workunit.client.0.vm05.stderr:++ base64 -d 2026-03-24T12:06:59.813 INFO:tasks.workunit.client.0.vm05.stderr:+ test '{"fsid":"a0c8ea99-5654-4097-bf98-f2a2e799bc82","client_id":"rbd-mirror-peer","key":"AQBjfsJpSSMxCRAAn8gggNFcp3BgkLlEdacI8A==","mon_host":"[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0]"}' = '{"fsid":"a0c8ea99-5654-4097-bf98-f2a2e799bc82","client_id":"rbd-mirror-peer","key":"AQBjfsJpSSMxCRAAn8gggNFcp3BgkLlEdacI8A==","mon_host":"[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0]"}' 2026-03-24T12:06:59.813 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror pool peer bootstrap create --mon-host '[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0],1.2.3.4:6789' rbd2 2026-03-24T12:06:59.813 INFO:tasks.workunit.client.0.vm05.stderr:++ base64 -d 2026-03-24T12:06:59.842 INFO:tasks.workunit.client.0.vm05.stderr:+ test '{"fsid":"a0c8ea99-5654-4097-bf98-f2a2e799bc82","client_id":"rbd-mirror-peer","key":"AQBjfsJpSSMxCRAAn8gggNFcp3BgkLlEdacI8A==","mon_host":"[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0]"}' = '{"fsid":"a0c8ea99-5654-4097-bf98-f2a2e799bc82","client_id":"rbd-mirror-peer","key":"AQBjfsJpSSMxCRAAn8gggNFcp3BgkLlEdacI8A==","mon_host":"[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0]"}' 2026-03-24T12:06:59.843 INFO:tasks.workunit.client.0.vm05.stderr:++ rbd mirror pool peer bootstrap create rbd2 2026-03-24T12:06:59.843 
INFO:tasks.workunit.client.0.vm05.stderr:++ base64 -d 2026-03-24T12:06:59.871 INFO:tasks.workunit.client.0.vm05.stderr:+ test '{"fsid":"a0c8ea99-5654-4097-bf98-f2a2e799bc82","client_id":"rbd-mirror-peer","key":"AQBjfsJpSSMxCRAAn8gggNFcp3BgkLlEdacI8A==","mon_host":"[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0]"}' = '{"fsid":"a0c8ea99-5654-4097-bf98-f2a2e799bc82","client_id":"rbd-mirror-peer","key":"AQBjfsJpSSMxCRAAn8gggNFcp3BgkLlEdacI8A==","mon_host":"[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0]"}' 2026-03-24T12:06:59.871 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd pool rm rbd2 rbd2 --yes-i-really-really-mean-it 2026-03-24T12:07:00.133 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd2' does not exist 2026-03-24T12:07:00.146 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd pool rm rbd1 rbd1 --yes-i-really-really-mean-it 2026-03-24T12:07:01.135 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd1' does not exist 2026-03-24T12:07:01.148 INFO:tasks.workunit.client.0.vm05.stdout:testing removing pool under running tasks... 2026-03-24T12:07:01.148 INFO:tasks.workunit.client.0.vm05.stderr:+ test_tasks_removed_pool 2026-03-24T12:07:01.148 INFO:tasks.workunit.client.0.vm05.stderr:+ echo 'testing removing pool under running tasks...' 
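The token checks logged above follow a decode-and-compare pattern: base64-decode the bootstrap token, pull fields out of the JSON, and compare them against the live cluster. A minimal sketch using a canned token of the same shape as the one logged (no cluster here, so the `ceph fsid` / `ceph auth get-key` comparisons are shown as comments, and a sed extractor stands in for the workunit's `jq -r`):

```shell
# In the live test: TOKEN=$(rbd mirror pool peer bootstrap create rbd1 | base64 -d)
TOKEN='{"fsid":"a0c8ea99-5654-4097-bf98-f2a2e799bc82","client_id":"rbd-mirror-peer","key":"AQBjfsJpSSMxCRAAn8gggNFcp3BgkLlEdacI8A=="}'

# Pull one string field out of the flat JSON (the workunit uses `jq -r .<field>`).
json_field() { echo "$2" | sed -n 's/.*"'"$1"'":"\([^"]*\)".*/\1/p'; }

TOKEN_FSID=$(json_field fsid "$TOKEN")
TOKEN_CLIENT_ID=$(json_field client_id "$TOKEN")

# Against a live cluster the workunit then asserts:
#   test "$TOKEN_FSID" = "$(ceph fsid)"
#   test "$(json_field key "$TOKEN")" = "$(ceph auth get-key client.$TOKEN_CLIENT_ID)"
echo "$TOKEN_FSID $TOKEN_CLIENT_ID"
```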
2026-03-24T12:07:01.148 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images 2026-03-24T12:07:01.148 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:01.254 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:01.336 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:01.420 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:01.514 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:01.598 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:01.685 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:01.774 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:01.868 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:01.953 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:02.037 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:02.128 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:02.216 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:02.309 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:02.404 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:02.499 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:02.591 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:02.678 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:02.764 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd pool create rbd2 8 2026-03-24T12:07:03.137 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd2' already exists 2026-03-24T12:07:03.150 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd pool init rbd2 2026-03-24T12:07:06.116 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 1G foo 2026-03-24T12:07:06.156 
INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create foo@snap 2026-03-24T12:07:07.129 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 2026-03-24T12:07:07.138 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap protect foo@snap 2026-03-24T12:07:07.169 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone foo@snap bar 2026-03-24T12:07:07.221 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 1G rbd2/dummy 2026-03-24T12:07:07.255 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd bench --io-type write --io-pattern seq --io-size 1M --io-total 1G rbd2/dummy 2026-03-24T12:07:07.288 INFO:tasks.workunit.client.0.vm05.stdout:bench type write io_size 1048576 io_threads 16 bytes 1073741824 pattern sequential 2026-03-24T12:07:08.306 INFO:tasks.workunit.client.0.vm05.stdout: SEC OPS OPS/SEC BYTES/SEC 2026-03-24T12:07:08.306 INFO:tasks.workunit.client.0.vm05.stdout: 1 368 413.793 414 MiB/s 2026-03-24T12:07:09.355 INFO:tasks.workunit.client.0.vm05.stdout: 2 736 379.798 380 MiB/s 2026-03-24T12:07:10.479 INFO:tasks.workunit.client.0.vm05.stdout: 3 992 324.742 325 MiB/s 2026-03-24T12:07:10.645 INFO:tasks.workunit.client.0.vm05.stdout:elapsed: 3 ops: 1024 ops/sec: 305.125 bytes/sec: 305 MiB/s 2026-03-24T12:07:10.652 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create rbd2/dummy@snap 2026-03-24T12:07:11.439 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 
2026-03-24T12:07:11.445 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap protect rbd2/dummy@snap 2026-03-24T12:07:11.476 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in {1..5} 2026-03-24T12:07:11.476 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone rbd2/dummy@snap rbd2/dummy1 2026-03-24T12:07:11.722 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in {1..5} 2026-03-24T12:07:11.722 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone rbd2/dummy@snap rbd2/dummy2 2026-03-24T12:07:11.768 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in {1..5} 2026-03-24T12:07:11.769 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone rbd2/dummy@snap rbd2/dummy3 2026-03-24T12:07:11.816 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in {1..5} 2026-03-24T12:07:11.816 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone rbd2/dummy@snap rbd2/dummy4 2026-03-24T12:07:11.865 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in {1..5} 2026-03-24T12:07:11.866 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone rbd2/dummy@snap rbd2/dummy5 2026-03-24T12:07:11.915 INFO:tasks.workunit.client.0.vm05.stderr:++ ceph rbd task list 2026-03-24T12:07:12.181 INFO:tasks.workunit.client.0.vm05.stderr:+ test '[]' = '[]' 2026-03-24T12:07:12.181 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in {1..5} 2026-03-24T12:07:12.181 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph rbd task add flatten rbd2/dummy1 2026-03-24T12:07:12.827 INFO:tasks.workunit.client.0.vm05.stdout:{"sequence": 1, "id": "0d590cfe-57cd-4371-95dd-04470487e03f", "message": "Flattening image rbd2/dummy1", "refs": {"action": "flatten", "pool_name": "rbd2", "pool_namespace": "", "image_name": "dummy1", "image_id": "321b14b752af"}, "in_progress": true, "progress": 0.03515625} 2026-03-24T12:07:12.859 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in {1..5} 2026-03-24T12:07:12.859 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph rbd task add flatten rbd2/dummy2 2026-03-24T12:07:14.131 
INFO:tasks.workunit.client.0.vm05.stdout:{"sequence": 2, "id": "be2791bd-baf5-494c-8261-cfd75e1dfc92", "message": "Flattening image rbd2/dummy2", "refs": {"action": "flatten", "pool_name": "rbd2", "pool_namespace": "", "image_name": "dummy2", "image_id": "321ef167f56f"}} 2026-03-24T12:07:14.180 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in {1..5} 2026-03-24T12:07:14.180 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph rbd task add flatten rbd2/dummy3 2026-03-24T12:07:15.400 INFO:tasks.workunit.client.0.vm05.stdout:{"sequence": 3, "id": "6f67010c-c4ab-4783-bde7-4299a311e2e2", "message": "Flattening image rbd2/dummy3", "refs": {"action": "flatten", "pool_name": "rbd2", "pool_namespace": "", "image_name": "dummy3", "image_id": "3221d9495726"}} 2026-03-24T12:07:15.441 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in {1..5} 2026-03-24T12:07:15.441 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph rbd task add flatten rbd2/dummy4 2026-03-24T12:07:16.626 INFO:tasks.workunit.client.0.vm05.stdout:{"sequence": 4, "id": "54deca04-d31c-4e59-8bf6-307e6bc25af1", "message": "Flattening image rbd2/dummy4", "refs": {"action": "flatten", "pool_name": "rbd2", "pool_namespace": "", "image_name": "dummy4", "image_id": "3224e9dd4ede"}} 2026-03-24T12:07:16.643 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in {1..5} 2026-03-24T12:07:16.643 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph rbd task add flatten rbd2/dummy5 2026-03-24T12:07:17.762 INFO:tasks.workunit.client.0.vm05.stdout:{"sequence": 5, "id": "717c935b-f66f-47c3-8d06-46a3c933ec14", "message": "Flattening image rbd2/dummy5", "refs": {"action": "flatten", "pool_name": "rbd2", "pool_namespace": "", "image_name": "dummy5", "image_id": "3227e76ea13d"}} 2026-03-24T12:07:17.783 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd pool delete rbd2 rbd2 --yes-i-really-really-mean-it 2026-03-24T12:07:18.603 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd2' does not exist 2026-03-24T12:07:18.614 
INFO:tasks.workunit.client.0.vm05.stderr:++ ceph rbd task list 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr:+ test '[ 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: { 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: "id": "be2791bd-baf5-494c-8261-cfd75e1dfc92", 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: "message": "Flattening image rbd2/dummy2", 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: "refs": { 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: "action": "flatten", 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: "image_id": "321ef167f56f", 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: "image_name": "dummy2", 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: "pool_name": "rbd2", 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: "pool_namespace": "" 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: }, 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: "sequence": 2 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: }, 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: { 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: "id": "6f67010c-c4ab-4783-bde7-4299a311e2e2", 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: "message": "Flattening image rbd2/dummy3", 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: "refs": { 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: "action": "flatten", 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: "image_id": "3221d9495726", 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: "image_name": "dummy3", 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: "pool_name": "rbd2", 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: "pool_namespace": "" 2026-03-24T12:07:19.197 
INFO:tasks.workunit.client.0.vm05.stderr: }, 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: "sequence": 3 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: }, 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: { 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: "id": "54deca04-d31c-4e59-8bf6-307e6bc25af1", 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: "message": "Flattening image rbd2/dummy4", 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: "refs": { 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: "action": "flatten", 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: "image_id": "3224e9dd4ede", 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: "image_name": "dummy4", 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: "pool_name": "rbd2", 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: "pool_namespace": "" 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: }, 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: "sequence": 4 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: }, 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: { 2026-03-24T12:07:19.197 INFO:tasks.workunit.client.0.vm05.stderr: "id": "717c935b-f66f-47c3-8d06-46a3c933ec14", 2026-03-24T12:07:19.198 INFO:tasks.workunit.client.0.vm05.stderr: "message": "Flattening image rbd2/dummy5", 2026-03-24T12:07:19.198 INFO:tasks.workunit.client.0.vm05.stderr: "refs": { 2026-03-24T12:07:19.198 INFO:tasks.workunit.client.0.vm05.stderr: "action": "flatten", 2026-03-24T12:07:19.198 INFO:tasks.workunit.client.0.vm05.stderr: "image_id": "3227e76ea13d", 2026-03-24T12:07:19.198 INFO:tasks.workunit.client.0.vm05.stderr: "image_name": "dummy5", 2026-03-24T12:07:19.198 INFO:tasks.workunit.client.0.vm05.stderr: "pool_name": "rbd2", 2026-03-24T12:07:19.198 
INFO:tasks.workunit.client.0.vm05.stderr: "pool_namespace": "" 2026-03-24T12:07:19.198 INFO:tasks.workunit.client.0.vm05.stderr: }, 2026-03-24T12:07:19.198 INFO:tasks.workunit.client.0.vm05.stderr: "sequence": 5 2026-03-24T12:07:19.198 INFO:tasks.workunit.client.0.vm05.stderr: } 2026-03-24T12:07:19.198 INFO:tasks.workunit.client.0.vm05.stderr:]' '!=' '[]' 2026-03-24T12:07:19.198 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info bar 2026-03-24T12:07:19.198 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'parent: ' 2026-03-24T12:07:19.299 INFO:tasks.workunit.client.0.vm05.stdout: parent: rbd/foo@snap 2026-03-24T12:07:19.299 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail rbd snap unprotect foo@snap 2026-03-24T12:07:19.299 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap unprotect foo@snap 2026-03-24T12:07:19.342 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T12:07:19.346+0000 7f60b5146640 -1 librbd::SnapshotUnprotectRequest: cannot unprotect: at least 1 child(ren) [320d70ebc9cb] in pool 'rbd' 2026-03-24T12:07:19.342 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T12:07:19.346+0000 7f60b5947640 -1 librbd::SnapshotUnprotectRequest: encountered error: (16) Device or resource busy 2026-03-24T12:07:19.342 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T12:07:19.346+0000 7f60b5947640 -1 librbd::SnapshotUnprotectRequest: 0x55b1b45cebc0 should_complete_error: ret_val=-16 2026-03-24T12:07:19.345 INFO:tasks.workunit.client.0.vm05.stderr:rbd: unprotecting snap failed: (16) Device or resource busy 2026-03-24T12:07:19.345 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T12:07:19.350+0000 7f60b5146640 -1 librbd::SnapshotUnprotectRequest: 0x55b1b45cebc0 should_complete_error: ret_val=-16 2026-03-24T12:07:19.352 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0 2026-03-24T12:07:19.352 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph rbd task add flatten bar 2026-03-24T12:07:19.682 INFO:tasks.workunit.client.0.vm05.stdout:{"sequence": 6, "id": 
"9016c20c-654f-4c40-89f4-4883ebe6c130", "message": "Flattening image rbd/bar", "refs": {"action": "flatten", "pool_name": "rbd", "pool_namespace": "", "image_name": "bar", "image_id": "320d70ebc9cb"}, "retry_attempts": 1, "retry_time": "2026-03-24T12:07:49.621703"} 2026-03-24T12:07:19.696 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in {1..12} 2026-03-24T12:07:19.696 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info bar 2026-03-24T12:07:19.696 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'parent: ' 2026-03-24T12:07:19.725 INFO:tasks.workunit.client.0.vm05.stderr:+ break 2026-03-24T12:07:19.725 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info bar 2026-03-24T12:07:19.725 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail grep 'parent: ' 2026-03-24T12:07:19.726 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'parent: ' 2026-03-24T12:07:19.753 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0 2026-03-24T12:07:19.753 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap unprotect foo@snap 2026-03-24T12:07:19.797 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in {1..12} 2026-03-24T12:07:19.798 INFO:tasks.workunit.client.0.vm05.stderr:++ ceph rbd task list 2026-03-24T12:07:20.052 INFO:tasks.workunit.client.0.vm05.stderr:+ test '[]' = '[]' 2026-03-24T12:07:20.052 INFO:tasks.workunit.client.0.vm05.stderr:+ break 2026-03-24T12:07:20.052 INFO:tasks.workunit.client.0.vm05.stderr:++ ceph rbd task list 2026-03-24T12:07:20.315 INFO:tasks.workunit.client.0.vm05.stderr:+ test '[]' = '[]' 2026-03-24T12:07:20.315 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images 2026-03-24T12:07:20.315 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:20.405 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:20.496 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:20.589 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:20.682 INFO:tasks.workunit.client.0.vm05.stderr:+ 
for img in $IMGS 2026-03-24T12:07:20.775 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:20.870 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:20.962 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:21.054 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:21.209 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:21.612 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:21.703 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:21.872 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:22.162 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:22.250 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:22.340 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:22.423 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:22.510 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:22.597 INFO:tasks.workunit.client.0.vm05.stderr:+ test_tasks_recovery 2026-03-24T12:07:22.597 INFO:tasks.workunit.client.0.vm05.stderr:+ echo 'testing task handler recovery after module'\''s RADOS client is blocklisted...' 2026-03-24T12:07:22.597 INFO:tasks.workunit.client.0.vm05.stdout:testing task handler recovery after module's RADOS client is blocklisted... 
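The removed-pool phase above fans out five clones of one protected snapshot and queues a background mgr flatten task for each, before deleting the pool out from under them. A sketch of that fan-out with the commands collected into an array rather than executed (a hypothetical stand-in, since `rbd` and `ceph` need a live cluster):

```shell
# Build the same command sequence the workunit runs in its two {1..5} loops.
CMDS=()
for i in {1..5}; do
    CMDS+=("rbd clone rbd2/dummy@snap rbd2/dummy$i")
done
for i in {1..5}; do
    CMDS+=("ceph rbd task add flatten rbd2/dummy$i")
done
# The workunit then removes rbd2 and polls `ceph rbd task list` until the
# orphaned task entries drain back to `[]`.
printf '%s\n' "${CMDS[@]}"
```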
2026-03-24T12:07:22.597 INFO:tasks.workunit.client.0.vm05.stderr:+ remove_images 2026-03-24T12:07:22.597 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:22.682 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:22.765 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:23.057 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:23.140 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:23.227 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:23.317 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:23.404 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:23.490 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:23.574 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:23.656 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:23.747 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:23.835 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:23.921 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:24.007 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:24.091 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:24.179 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:24.267 INFO:tasks.workunit.client.0.vm05.stderr:+ for img in $IMGS 2026-03-24T12:07:24.351 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd pool create rbd2 8 2026-03-24T12:07:25.530 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd2' already exists 2026-03-24T12:07:25.543 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd pool init rbd2 2026-03-24T12:07:28.496 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd create --image-format 2 --size 1G rbd2/img1 
2026-03-24T12:07:28.529 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd bench --io-type write --io-pattern seq --io-size 1M --io-total 1G rbd2/img1 2026-03-24T12:07:28.560 INFO:tasks.workunit.client.0.vm05.stdout:bench type write io_size 1048576 io_threads 16 bytes 1073741824 pattern sequential 2026-03-24T12:07:29.618 INFO:tasks.workunit.client.0.vm05.stdout: SEC OPS OPS/SEC BYTES/SEC 2026-03-24T12:07:29.618 INFO:tasks.workunit.client.0.vm05.stdout: 1 224 239.044 239 MiB/s 2026-03-24T12:07:30.570 INFO:tasks.workunit.client.0.vm05.stdout: 2 512 269.939 270 MiB/s 2026-03-24T12:07:30.735 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T12:07:30.742+0000 7f33c3847640 0 -- 192.168.123.105:0/1965303367 >> [v2:192.168.123.105:6808/3270659984,v1:192.168.123.105:6809/3270659984] conn(0x559bccb85c70 msgr2=0x7f33a0080400 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T12:07:31.581 INFO:tasks.workunit.client.0.vm05.stdout: 3 816 280.323 280 MiB/s 2026-03-24T12:07:31.696 INFO:tasks.workunit.client.0.vm05.stderr:2026-03-24T12:07:31.702+0000 7f33c3847640 0 -- 192.168.123.105:0/1965303367 >> [v2:192.168.123.105:6808/3270659984,v1:192.168.123.105:6809/3270659984] conn(0x559bccb85c70 msgr2=0x7f33a00818c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until injecting socket failure 2026-03-24T12:07:32.416 INFO:tasks.workunit.client.0.vm05.stdout:elapsed: 3 ops: 1024 ops/sec: 265.56 bytes/sec: 266 MiB/s 2026-03-24T12:07:32.424 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap create rbd2/img1@snap 2026-03-24T12:07:32.510 INFO:tasks.workunit.client.0.vm05.stderr: Creating snap: 10% complete... Creating snap: 100% complete...done. 
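Both the perf-iostat wait earlier in the log and the post-blocklist recovery below use the same bounded retry shape: up to 24 attempts, sleeping 10 seconds after each EAGAIN. A sketch with a stub in place of the real `ceph rbd task add` (hypothetical `try_cmd` that "succeeds" on the third attempt; the sleep is shortened so the sketch runs instantly):

```shell
# Stub standing in for `ceph rbd task add flatten rbd2/clone1`, which keeps
# returning EAGAIN ("rbd_support module is not ready") until the module's
# RADOS client re-registers after being blocklisted.
attempts=0
try_cmd() { attempts=$((attempts + 1)); [ "$attempts" -ge 3 ]; }

for i in $(seq 24); do
    if try_cmd; then
        break          # command accepted: the module is ready again
    fi
    sleep 0            # the workunit sleeps 10 between attempts
done
echo "attempts=$attempts"   # -> attempts=3
```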
2026-03-24T12:07:32.518 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap protect rbd2/img1@snap 2026-03-24T12:07:32.550 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd clone rbd2/img1@snap rbd2/clone1 2026-03-24T12:07:32.597 INFO:tasks.workunit.client.0.vm05.stderr:++ ceph mgr dump 2026-03-24T12:07:32.598 INFO:tasks.workunit.client.0.vm05.stderr:++ jq 'select(.name == "rbd_support")' 2026-03-24T12:07:32.598 INFO:tasks.workunit.client.0.vm05.stderr:++ jq '.active_clients[]' 2026-03-24T12:07:32.598 INFO:tasks.workunit.client.0.vm05.stderr:++ jq -r '[.addrvec[0].addr, "/", .addrvec[0].nonce|tostring] | add' 2026-03-24T12:07:32.884 INFO:tasks.workunit.client.0.vm05.stderr:+ CLIENT_ADDR=192.168.123.105:0/423666768 2026-03-24T12:07:32.884 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd blocklist add 192.168.123.105:0/423666768 2026-03-24T12:07:34.512 INFO:tasks.workunit.client.0.vm05.stderr:blocklisting 192.168.123.105:0/423666768 until 2026-03-24T13:07:33.584344+0000 (3600 sec) 2026-03-24T12:07:34.527 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail ceph rbd task add flatten rbd2/clone1 2026-03-24T12:07:34.527 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph rbd task add flatten rbd2/clone1 2026-03-24T12:07:34.707 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T12:07:34.714+0000 7f1d14709640 -1 librbd::image::OpenRequest: failed to stat v2 image header: (108) Cannot send after transport endpoint shutdown 2026-03-24T12:07:34.708 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T12:07:34.714+0000 7f1d14709640 -1 librbd::ImageState: 0x556e1b0e7400 failed to open image: (108) Cannot send after transport endpoint shutdown 2026-03-24T12:07:34.708 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T12:07:34.714+0000 7f1d10701640 -1 mgr.server reply reply (11) Resource temporarily unavailable [errno 108] RBD connection was shutdown (error opening image b'clone1' at snapshot None) 2026-03-24T12:07:34.708 INFO:tasks.workunit.client.0.vm05.stderr:Error EAGAIN: [errno 108] RBD 
connection was shutdown (error opening image b'clone1' at snapshot None)
2026-03-24T12:07:34.715 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T12:07:34.715 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T12:07:44.717 INFO:tasks.workunit.client.0.vm05.stderr:++ seq 24
2026-03-24T12:07:44.718 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 24`
2026-03-24T12:07:44.718 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph rbd task add flatten rbd2/clone1
2026-03-24T12:07:44.905 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T12:07:44.910+0000 7f1d10701640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again
2026-03-24T12:07:44.906 INFO:tasks.workunit.client.0.vm05.stderr:Error EAGAIN: rbd_support module is not ready, try again
2026-03-24T12:07:44.911 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T12:07:54.912 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 24`
2026-03-24T12:07:54.912 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph rbd task add flatten rbd2/clone1
2026-03-24T12:07:55.094 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T12:07:55.098+0000 7f1d10701640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again
2026-03-24T12:07:55.094 INFO:tasks.workunit.client.0.vm05.stderr:Error EAGAIN: rbd_support module is not ready, try again
2026-03-24T12:07:55.099 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T12:08:05.100 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 24`
2026-03-24T12:08:05.101 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph rbd task add flatten rbd2/clone1
2026-03-24T12:08:05.277 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T12:08:05.282+0000 7f1d10701640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again
2026-03-24T12:08:05.277 INFO:tasks.workunit.client.0.vm05.stderr:Error EAGAIN: rbd_support module is not ready, try again
2026-03-24T12:08:05.282 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T12:08:15.283 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 24`
2026-03-24T12:08:15.283 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph rbd task add flatten rbd2/clone1
2026-03-24T12:08:15.456 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T12:08:15.462+0000 7f1d10701640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again
2026-03-24T12:08:15.457 INFO:tasks.workunit.client.0.vm05.stderr:Error EAGAIN: rbd_support module is not ready, try again
2026-03-24T12:08:15.461 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T12:08:25.462 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 24`
2026-03-24T12:08:25.462 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph rbd task add flatten rbd2/clone1
2026-03-24T12:08:25.634 INFO:tasks.ceph.mgr.x.vm05.stderr:2026-03-24T12:08:25.638+0000 7f1d10701640 -1 mgr.server reply reply (11) Resource temporarily unavailable rbd_support module is not ready, try again
2026-03-24T12:08:25.634 INFO:tasks.workunit.client.0.vm05.stderr:Error EAGAIN: rbd_support module is not ready, try again
2026-03-24T12:08:25.639 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T12:08:35.640 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in `seq 24`
2026-03-24T12:08:35.640 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph rbd task add flatten rbd2/clone1
2026-03-24T12:08:36.192 INFO:tasks.workunit.client.0.vm05.stdout:{"sequence": 1, "id": "cc3e64d7-ac12-490c-ad6d-336e5c18d425", "message": "Flattening image rbd2/clone1", "refs": {"action": "flatten", "pool_name": "rbd2", "pool_namespace": "", "image_name": "clone1", "image_id": "333ac9ad31ee"}, "in_progress": true, "progress": 0.03515625}
2026-03-24T12:08:36.212 INFO:tasks.workunit.client.0.vm05.stderr:+ break
2026-03-24T12:08:36.213 INFO:tasks.workunit.client.0.vm05.stderr:++ ceph rbd task list
2026-03-24T12:08:36.580 INFO:tasks.workunit.client.0.vm05.stderr:+ test '[
2026-03-24T12:08:36.580 INFO:tasks.workunit.client.0.vm05.stderr: {
2026-03-24T12:08:36.580 INFO:tasks.workunit.client.0.vm05.stderr: "id": "cc3e64d7-ac12-490c-ad6d-336e5c18d425",
2026-03-24T12:08:36.580 INFO:tasks.workunit.client.0.vm05.stderr: "in_progress": true,
2026-03-24T12:08:36.580 INFO:tasks.workunit.client.0.vm05.stderr: "message": "Flattening image rbd2/clone1",
2026-03-24T12:08:36.580 INFO:tasks.workunit.client.0.vm05.stderr: "progress": 0.17578125,
2026-03-24T12:08:36.580 INFO:tasks.workunit.client.0.vm05.stderr: "refs": {
2026-03-24T12:08:36.580 INFO:tasks.workunit.client.0.vm05.stderr: "action": "flatten",
2026-03-24T12:08:36.580 INFO:tasks.workunit.client.0.vm05.stderr: "image_id": "333ac9ad31ee",
2026-03-24T12:08:36.580 INFO:tasks.workunit.client.0.vm05.stderr: "image_name": "clone1",
2026-03-24T12:08:36.580 INFO:tasks.workunit.client.0.vm05.stderr: "pool_name": "rbd2",
2026-03-24T12:08:36.580 INFO:tasks.workunit.client.0.vm05.stderr: "pool_namespace": ""
2026-03-24T12:08:36.580 INFO:tasks.workunit.client.0.vm05.stderr: },
2026-03-24T12:08:36.580 INFO:tasks.workunit.client.0.vm05.stderr: "sequence": 1
2026-03-24T12:08:36.580 INFO:tasks.workunit.client.0.vm05.stderr: }
2026-03-24T12:08:36.580 INFO:tasks.workunit.client.0.vm05.stderr:]' '!=' '[]'
2026-03-24T12:08:36.580 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in {1..12}
2026-03-24T12:08:36.580 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info rbd2/clone1
2026-03-24T12:08:36.580 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'parent: '
2026-03-24T12:08:36.835 INFO:tasks.workunit.client.0.vm05.stdout: parent: rbd2/img1@snap
2026-03-24T12:08:36.835 INFO:tasks.workunit.client.0.vm05.stderr:+ sleep 10
2026-03-24T12:08:46.837 INFO:tasks.workunit.client.0.vm05.stderr:+ for i in {1..12}
2026-03-24T12:08:46.837 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info rbd2/clone1
2026-03-24T12:08:46.837 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'parent: '
2026-03-24T12:08:46.863 INFO:tasks.workunit.client.0.vm05.stderr:+ break
2026-03-24T12:08:46.863 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd info rbd2/clone1
2026-03-24T12:08:46.863 INFO:tasks.workunit.client.0.vm05.stderr:+ expect_fail grep 'parent: '
2026-03-24T12:08:46.863 INFO:tasks.workunit.client.0.vm05.stderr:+ grep 'parent: '
2026-03-24T12:08:46.889 INFO:tasks.workunit.client.0.vm05.stderr:+ return 0
2026-03-24T12:08:46.889 INFO:tasks.workunit.client.0.vm05.stderr:+ rbd snap unprotect rbd2/img1@snap
2026-03-24T12:08:46.919 INFO:tasks.workunit.client.0.vm05.stderr:++ ceph rbd task list
2026-03-24T12:08:47.168 INFO:tasks.workunit.client.0.vm05.stderr:+ test '[]' = '[]'
2026-03-24T12:08:47.168 INFO:tasks.workunit.client.0.vm05.stderr:+ ceph osd pool rm rbd2 rbd2 --yes-i-really-really-mean-it
2026-03-24T12:08:47.659 INFO:tasks.workunit.client.0.vm05.stderr:pool 'rbd2' does not exist
2026-03-24T12:08:47.686 INFO:tasks.workunit.client.0.vm05.stdout:OK
2026-03-24T12:08:47.687 INFO:tasks.workunit.client.0.vm05.stderr:+ echo OK
2026-03-24T12:08:47.687 INFO:teuthology.orchestra.run:Running command with timeout 3600
2026-03-24T12:08:47.687 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp
2026-03-24T12:08:47.720 INFO:tasks.workunit:Stopping ['rbd/cli_generic.sh'] on client.0...
2026-03-24T12:08:47.720 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.0 /home/ubuntu/cephtest/clone.client.0
2026-03-24T12:08:48.437 DEBUG:teuthology.parallel:result is None
2026-03-24T12:08:48.437 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0
2026-03-24T12:08:48.444 INFO:tasks.workunit:Deleted dir /home/ubuntu/cephtest/mnt.0/client.0
2026-03-24T12:08:48.444 DEBUG:teuthology.orchestra.run.vm05:> rmdir -- /home/ubuntu/cephtest/mnt.0
2026-03-24T12:08:48.494 INFO:tasks.workunit:Deleted artificial mount point /home/ubuntu/cephtest/mnt.0/client.0
2026-03-24T12:08:48.494 DEBUG:teuthology.run_tasks:Unwinding manager ceph
2026-03-24T12:08:48.496 INFO:tasks.ceph.ceph_manager.ceph:waiting for clean
2026-03-24T12:08:48.496 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json
2026-03-24T12:08:48.700 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-24T12:08:48.700 INFO:teuthology.orchestra.run.vm05.stderr:dumped all
2026-03-24T12:08:48.713
INFO:teuthology.orchestra.run.vm05.stdout:{"pg_ready":true,"pg_map":{"version":2653,"stamp":"2026-03-24T12:08:47.564925+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459307,"num_objects":9,"num_object_clones":0,"num_object_copies":18,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":9,"num_whiteouts":0,"num_read":84491,"num_read_kb":234593,"num_write":43531,"num_write_kb":8895353,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":47194,"ondisk_log_size":47194,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":34,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":4710748,"kb_used_data":2103724,"kb_used_omap":1293,"kb_used_meta":2605682,"kb_avail":278404772,"statfs":{"total":289910292480,"available":285086486528,"internally_reserved":0,"allocated":2154213376,"data_stored":4298700066,"data_compressed":35135504,"data_compressed_allocated":2147926016,"data_compressed_original":4295852032,"omap_allocated":1324554,"internal_metadata":2668218870},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_qu
eue_age_hist":{"histogram":[0,0,2,0,2,2,1],"upper_bound":128},"perf_stat":{"commit_latency_ms":44,"apply_latency_ms":44,"commit_latency_ns":44000000,"apply_latency_ns":44000000},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":-1073742169,"num_objects":-267,"num_object_clones":0,"num_object_copies":-534,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":-267,"num_whiteouts":0,"num_read":-1766,"num_read_kb":-1466,"num_write":-1560,"num_write_kb":-1049105,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":-4,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"10.561984"},"pg_stats":[{"pgid":"2.7","version":"232'5268","reported_seq":9824,"reported_epoch":241,"state":"active+clean","last_fresh":"2026-03-24T12:07:35.933713+0000","last_change":"2026-03-24T12:07:21.476270+0000","last_active":"2026-03-24T12:07:35.933713+0000","last_peered":"2026-03-24T12:07:35.933713+0000","last_clean":"2026-03-24T12:07:35.933713+0000","last_became_active":"2026-03-24T10:48:23.431584+0000","last_became_peered":"2026-03-24T10:48:23.431584+0000","last_unstale":"2026-03-24T12:07:35.933713+0000","last_undegraded":"20
26-03-24T12:07:35.933713+0000","last_fullsized":"2026-03-24T12:07:35.933713+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":5268,"log_dups_size":0,"ondisk_log_size":5268,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T13:47:47.184319+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00090827900000000001,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":8686,"num_read_kb":25791,"num_write":4782,"num_write_kb":1233241,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.6","version":"232'5793","reported_
seq":9406,"reported_epoch":241,"state":"active+clean","last_fresh":"2026-03-24T12:07:35.933694+0000","last_change":"2026-03-24T12:07:21.479247+0000","last_active":"2026-03-24T12:07:35.933694+0000","last_peered":"2026-03-24T12:07:35.933694+0000","last_clean":"2026-03-24T12:07:35.933694+0000","last_became_active":"2026-03-24T10:48:23.430333+0000","last_became_peered":"2026-03-24T10:48:23.430333+0000","last_unstale":"2026-03-24T12:07:35.933694+0000","last_undegraded":"2026-03-24T12:07:35.933694+0000","last_fullsized":"2026-03-24T12:07:35.933694+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":5793,"log_dups_size":0,"ondisk_log_size":5793,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T13:50:29.496857+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0038736769999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":7787,"num_read_kb":26709,"num_write":4797,"num_write_kb":1090629,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.5","version":"232'4794","reported_seq":8189,"reported_epoch":241,"state":"active+clean","last_fresh":"2026-03-24T12:07:35.933977+0000","last_change":"2026-03-24T12:07:21.480019+0000","last_active":"2026-03-24T12:07:35.933977+0000","last_peered":"2026-03-24T12:07:35.933977+0000","last_clean":"2026-03-24T12:07:35.933977+0000","last_became_active":"2026-03-24T10:48:23.431659+0000","last_became_peered":"2026-03-24T10:48:23.431659+0000","last_unstale":"2026-03-24T12:07:35.933977+0000","last_undegraded":"2026-03-24T12:07:35.933977+0000","last_fullsized":"2026-03-24T12:07:35.933977+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_dee
p_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":4794,"log_dups_size":0,"ondisk_log_size":4794,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T19:11:31.198223+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0037852200000000002,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":5400,"num_read_kb":37023,"num_write":4256,"num_write_kb":1133997,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.4","version":"232'6740","reported_seq":12445,"reported_epoch":241,"state":"active+clean","last_fresh":"2026-03-24T12:07:35.933649+0000","last_change":"2026-03-24T12:07:21.479955+0000","last_active":"2026-03-24T12:07:35.933649+0000","last_peered":"2026-03-24T12:07:35.933649+0000","last_clean":"2026-03-24T12:07:35.933649+0000","last_became_active":"2026-03
-24T10:48:23.430350+0000","last_became_peered":"2026-03-24T10:48:23.430350+0000","last_unstale":"2026-03-24T12:07:35.933649+0000","last_undegraded":"2026-03-24T12:07:35.933649+0000","last_fullsized":"2026-03-24T12:07:35.933649+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":6740,"log_dups_size":0,"ondisk_log_size":6740,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T13:52:30.046595+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0039362920000000001,"stat_sum":{"num_bytes":8,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":10594,"num_read_kb":24480,"num_write":5034,"num_write_kb":1092765,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_mis
sing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.2","version":"232'7856","reported_seq":12999,"reported_epoch":241,"state":"active+clean","last_fresh":"2026-03-24T12:07:36.007642+0000","last_change":"2026-03-24T12:07:21.480604+0000","last_active":"2026-03-24T12:07:36.007642+0000","last_peered":"2026-03-24T12:07:36.007642+0000","last_clean":"2026-03-24T12:07:36.007642+0000","last_became_active":"2026-03-24T10:48:23.429577+0000","last_became_peered":"2026-03-24T10:48:23.429577+0000","last_unstale":"2026-03-24T12:07:36.007642+0000","last_undegraded":"2026-03-24T12:07:36.007642+0000","last_fullsized":"2026-03-24T12:07:36.007642+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":7856,"log_dups_size":0,"ondisk_log_size":7856,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T21:18:26.868949+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00022347799999999999,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":27616,"num_read_kb":42333,"num_write":10790,"num_write_kb":1162620,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.1","version":"232'4710","reported_seq":8468,"reported_epoch":241,"state":"active+clean","last_fresh":"2026-03-24T12:08:31.349126+0000","last_change":"2026-03-24T12:07:21.473704+0000","last_active":"2026-03-24T12:08:31.349126+0000","last_peered":"2026-03-24T12:08:31.349126+0000","last_clean":"2026-03-24T12:08:31.349126+0000","last_became_active":"2026-03-24T10:48:23.429702+0000","last_became_peered":"2026-03-24T10:48:23.429702+0000","last_unstale":"2026-03-24T12:08:31.349126+0000","last_undegraded":"2026-03-24T12:08:31.349126+0000","last_fullsized":"2026-03-24T12:08:31.349126+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last
_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":4710,"log_dups_size":0,"ondisk_log_size":4710,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T16:45:53.747925+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00039478700000000001,"stat_sum":{"num_bytes":0,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":6368,"num_read_kb":23686,"num_write":4292,"num_write_kb":1081521,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.0","version":"232'5398","reported_seq":9476,"reported_epoch":241,"state":"active+clean","last_fresh":"2026-03-24T12:07:35.484199+0000","last_change":"2026-03-24T12:07:21.473777+0000","last_active":"2026-03-24T12:07:35.484199+0000","last_peered":"2026-03-24T12:07:35.484199+0000","last_clean":"2026-03-24T12:07:35.484199+0000","last_became_active":"202
6-03-24T10:48:23.429898+0000","last_became_peered":"2026-03-24T10:48:23.429898+0000","last_unstale":"2026-03-24T12:07:35.484199+0000","last_undegraded":"2026-03-24T12:07:35.484199+0000","last_fullsized":"2026-03-24T12:07:35.484199+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":5398,"log_dups_size":0,"ondisk_log_size":5398,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T13:27:45.399809+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.000404526,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":7747,"num_read_kb":25651,"num_write":4200,"num_write_kb":1069536,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[
],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.3","version":"232'6603","reported_seq":13103,"reported_epoch":241,"state":"active+clean","last_fresh":"2026-03-24T12:07:35.933639+0000","last_change":"2026-03-24T12:07:21.476398+0000","last_active":"2026-03-24T12:07:35.933639+0000","last_peered":"2026-03-24T12:07:35.933639+0000","last_clean":"2026-03-24T12:07:35.933639+0000","last_became_active":"2026-03-24T10:48:23.429112+0000","last_became_peered":"2026-03-24T10:48:23.429112+0000","last_unstale":"2026-03-24T12:07:35.933639+0000","last_undegraded":"2026-03-24T12:07:35.933639+0000","last_fullsized":"2026-03-24T12:07:35.933639+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":6603,"log_dups_size":0,"ondisk_log_size":6603,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T13:58:45.552605+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00038145399999999998,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":10247,"num_read_kb":28883,"num_write":5323,"num_write_kb":1030460,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":1,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"1.0","version":"10'32","reported_seq":554,"reported_epoch":241,"state":"active+clean","last_fresh":"2026-03-24T12:07:35.933954+0000","last_change":"2026-03-24T10:48:20.416109+0000","last_active":"2026-03-24T12:07:35.933954+0000","last_peered":"2026-03-24T12:07:35.933954+0000","last_clean":"2026-03-24T12:07:35.933954+0000","last_became_active":"2026-03-24T10:48:20.415531+0000","last_became_peered":"2026-03-24T10:48:20.415531+0000","last_unstale":"2026-03-24T12:07:35.933954+0000","last_undegraded":"2026-03-24T12:07:35.933954+0000","last_fullsized":"2026-03-24T12:07:35.933954+0000","mapping_epoch":9,"log_start":"0'0","ondisk_log_start":"0'0","created":9,"last_epoch_clean":10,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:19.407878+0000","last_deep_scrub":"0'0","last_deep_sc
rub_stamp":"2026-03-24T10:48:19.407878+0000","last_clean_scrub_stamp":"2026-03-24T10:48:19.407878+0000","objects_scrubbed":0,"log_size":32,"log_dups_size":0,"ondisk_log_size":32,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T12:59:59.945556+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}],"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":27,"num_objects":7,"num_object_clones":0,"num_object_copies":14,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":7,"num_whiteouts":0,"num_read":84445,"num_read_kb":234556,"num_write":43474,"num_write_kb":8894769,"num_scrub_errors":0,"
num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":16384,"data_stored":54,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1170335,"internal_metadata":0},"log_size":47162,"ondisk_log_size":47162,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":483328,"data_stored":918560,"data_compressed":8208,"data_compressed_allocated":442368,"data_compressed_original":884736,"omap_allocated":0,"internal_meta
data":0},"log_size":32,"ondisk_log_size":32,"up":2,"acting":2,"num_store_stats":2}],"osd_stats":[{"osd":2,"up_from":8,"seq":34359739337,"num_pgs":7,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":1390200,"kb_used_data":569320,"kb_used_omap":339,"kb_used_meta":820524,"kb_avail":92981640,"statfs":{"total":96636764160,"available":95213199360,"internally_reserved":0,"allocated":582983680,"data_stored":1162760242,"data_compressed":9502208,"data_compressed_allocated":580911104,"data_compressed_original":1161822208,"omap_allocated":347206,"internal_metadata":840217530},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":11,"apply_latency_ms":11,"commit_latency_ns":11000000,"apply_latency_ns":11000000},"alerts":[]},{"osd":1,"up_from":8,"seq":34359739338,"num_pgs":13,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":1376620,"kb_used_data":483548,"kb_used_omap":561,"kb_used_meta":892494,"kb_avail":92995220,"statfs":{"total":96636764160,"available":95227105280,"internally_reserved":0,"allocated":495153152,"data_stored":987058702,"data_compressed":8065544,"data_compressed_allocated":493051904,"data_compressed_original":986103808,"omap_allocated":575432,"internal_metadata":913913912},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":16,"apply_latency_ms":16,"commit_latency_ns":16000000,"apply_latency_ns":16000000},"alerts":[]},{"osd":0,"up_from":8,"seq":34359739335,"num_pgs":14,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":1943928,"kb_used_data":1050856,"kb_used_omap":392,"kb_used_meta":892663,"kb_avail":92427912,"statfs":{"total":96636764160,"available":94646181888,"internally_reserved":0,"allocated":1076076544,"data_stored":2
148881122,"data_compressed":17567752,"data_compressed_allocated":1073963008,"data_compressed_original":2147926016,"omap_allocated":401916,"internal_metadata":914087428},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[0,0,2,0,2,2,1],"upper_bound":128},"perf_stat":{"commit_latency_ms":17,"apply_latency_ms":17,"commit_latency_ns":17000000,"apply_latency_ns":17000000},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":4104,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":4104,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":355046,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":538958,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":276331,"internal_metadata":0}]}} 2026-03-24T12:08:48.713 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json 2026-03-24T12:08:48.864 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-24T12:08:48.864 INFO:teuthology.orchestra.run.vm05.stderr:dumped all 2026-03-24T12:08:48.877 
INFO:teuthology.orchestra.run.vm05.stdout:{"pg_ready":true,"pg_map":{"version":2653,"stamp":"2026-03-24T12:08:47.564925+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459307,"num_objects":9,"num_object_clones":0,"num_object_copies":18,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":9,"num_whiteouts":0,"num_read":84491,"num_read_kb":234593,"num_write":43531,"num_write_kb":8895353,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":47194,"ondisk_log_size":47194,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":34,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":4710748,"kb_used_data":2103724,"kb_used_omap":1293,"kb_used_meta":2605682,"kb_avail":278404772,"statfs":{"total":289910292480,"available":285086486528,"internally_reserved":0,"allocated":2154213376,"data_stored":4298700066,"data_compressed":35135504,"data_compressed_allocated":2147926016,"data_compressed_original":4295852032,"omap_allocated":1324554,"internal_metadata":2668218870},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_qu
eue_age_hist":{"histogram":[0,0,2,0,2,2,1],"upper_bound":128},"perf_stat":{"commit_latency_ms":44,"apply_latency_ms":44,"commit_latency_ns":44000000,"apply_latency_ns":44000000},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":-1073742169,"num_objects":-267,"num_object_clones":0,"num_object_copies":-534,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":-267,"num_whiteouts":0,"num_read":-1766,"num_read_kb":-1466,"num_write":-1560,"num_write_kb":-1049105,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":-4,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"10.561984"},"pg_stats":[{"pgid":"2.7","version":"232'5268","reported_seq":9824,"reported_epoch":241,"state":"active+clean","last_fresh":"2026-03-24T12:07:35.933713+0000","last_change":"2026-03-24T12:07:21.476270+0000","last_active":"2026-03-24T12:07:35.933713+0000","last_peered":"2026-03-24T12:07:35.933713+0000","last_clean":"2026-03-24T12:07:35.933713+0000","last_became_active":"2026-03-24T10:48:23.431584+0000","last_became_peered":"2026-03-24T10:48:23.431584+0000","last_unstale":"2026-03-24T12:07:35.933713+0000","last_undegraded":"20
26-03-24T12:07:35.933713+0000","last_fullsized":"2026-03-24T12:07:35.933713+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":5268,"log_dups_size":0,"ondisk_log_size":5268,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T13:47:47.184319+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00090827900000000001,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":8686,"num_read_kb":25791,"num_write":4782,"num_write_kb":1233241,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.6","version":"232'5793","reported_
seq":9406,"reported_epoch":241,"state":"active+clean","last_fresh":"2026-03-24T12:07:35.933694+0000","last_change":"2026-03-24T12:07:21.479247+0000","last_active":"2026-03-24T12:07:35.933694+0000","last_peered":"2026-03-24T12:07:35.933694+0000","last_clean":"2026-03-24T12:07:35.933694+0000","last_became_active":"2026-03-24T10:48:23.430333+0000","last_became_peered":"2026-03-24T10:48:23.430333+0000","last_unstale":"2026-03-24T12:07:35.933694+0000","last_undegraded":"2026-03-24T12:07:35.933694+0000","last_fullsized":"2026-03-24T12:07:35.933694+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":5793,"log_dups_size":0,"ondisk_log_size":5793,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T13:50:29.496857+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0038736769999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":7787,"num_read_kb":26709,"num_write":4797,"num_write_kb":1090629,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.5","version":"232'4794","reported_seq":8189,"reported_epoch":241,"state":"active+clean","last_fresh":"2026-03-24T12:07:35.933977+0000","last_change":"2026-03-24T12:07:21.480019+0000","last_active":"2026-03-24T12:07:35.933977+0000","last_peered":"2026-03-24T12:07:35.933977+0000","last_clean":"2026-03-24T12:07:35.933977+0000","last_became_active":"2026-03-24T10:48:23.431659+0000","last_became_peered":"2026-03-24T10:48:23.431659+0000","last_unstale":"2026-03-24T12:07:35.933977+0000","last_undegraded":"2026-03-24T12:07:35.933977+0000","last_fullsized":"2026-03-24T12:07:35.933977+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_dee
p_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":4794,"log_dups_size":0,"ondisk_log_size":4794,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T19:11:31.198223+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0037852200000000002,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":5400,"num_read_kb":37023,"num_write":4256,"num_write_kb":1133997,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.4","version":"232'6740","reported_seq":12445,"reported_epoch":241,"state":"active+clean","last_fresh":"2026-03-24T12:07:35.933649+0000","last_change":"2026-03-24T12:07:21.479955+0000","last_active":"2026-03-24T12:07:35.933649+0000","last_peered":"2026-03-24T12:07:35.933649+0000","last_clean":"2026-03-24T12:07:35.933649+0000","last_became_active":"2026-03
-24T10:48:23.430350+0000","last_became_peered":"2026-03-24T10:48:23.430350+0000","last_unstale":"2026-03-24T12:07:35.933649+0000","last_undegraded":"2026-03-24T12:07:35.933649+0000","last_fullsized":"2026-03-24T12:07:35.933649+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":6740,"log_dups_size":0,"ondisk_log_size":6740,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T13:52:30.046595+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0039362920000000001,"stat_sum":{"num_bytes":8,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":10594,"num_read_kb":24480,"num_write":5034,"num_write_kb":1092765,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_mis
sing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.2","version":"232'7856","reported_seq":12999,"reported_epoch":241,"state":"active+clean","last_fresh":"2026-03-24T12:07:36.007642+0000","last_change":"2026-03-24T12:07:21.480604+0000","last_active":"2026-03-24T12:07:36.007642+0000","last_peered":"2026-03-24T12:07:36.007642+0000","last_clean":"2026-03-24T12:07:36.007642+0000","last_became_active":"2026-03-24T10:48:23.429577+0000","last_became_peered":"2026-03-24T10:48:23.429577+0000","last_unstale":"2026-03-24T12:07:36.007642+0000","last_undegraded":"2026-03-24T12:07:36.007642+0000","last_fullsized":"2026-03-24T12:07:36.007642+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":7856,"log_dups_size":0,"ondisk_log_size":7856,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T21:18:26.868949+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00022347799999999999,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":27616,"num_read_kb":42333,"num_write":10790,"num_write_kb":1162620,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.1","version":"232'4710","reported_seq":8468,"reported_epoch":241,"state":"active+clean","last_fresh":"2026-03-24T12:08:31.349126+0000","last_change":"2026-03-24T12:07:21.473704+0000","last_active":"2026-03-24T12:08:31.349126+0000","last_peered":"2026-03-24T12:08:31.349126+0000","last_clean":"2026-03-24T12:08:31.349126+0000","last_became_active":"2026-03-24T10:48:23.429702+0000","last_became_peered":"2026-03-24T10:48:23.429702+0000","last_unstale":"2026-03-24T12:08:31.349126+0000","last_undegraded":"2026-03-24T12:08:31.349126+0000","last_fullsized":"2026-03-24T12:08:31.349126+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last
_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":4710,"log_dups_size":0,"ondisk_log_size":4710,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T16:45:53.747925+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00039478700000000001,"stat_sum":{"num_bytes":0,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":6368,"num_read_kb":23686,"num_write":4292,"num_write_kb":1081521,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.0","version":"232'5398","reported_seq":9476,"reported_epoch":241,"state":"active+clean","last_fresh":"2026-03-24T12:07:35.484199+0000","last_change":"2026-03-24T12:07:21.473777+0000","last_active":"2026-03-24T12:07:35.484199+0000","last_peered":"2026-03-24T12:07:35.484199+0000","last_clean":"2026-03-24T12:07:35.484199+0000","last_became_active":"202
6-03-24T10:48:23.429898+0000","last_became_peered":"2026-03-24T10:48:23.429898+0000","last_unstale":"2026-03-24T12:07:35.484199+0000","last_undegraded":"2026-03-24T12:07:35.484199+0000","last_fullsized":"2026-03-24T12:07:35.484199+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":5398,"log_dups_size":0,"ondisk_log_size":5398,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T13:27:45.399809+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.000404526,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":7747,"num_read_kb":25651,"num_write":4200,"num_write_kb":1069536,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[
],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.3","version":"232'6603","reported_seq":13103,"reported_epoch":241,"state":"active+clean","last_fresh":"2026-03-24T12:07:35.933639+0000","last_change":"2026-03-24T12:07:21.476398+0000","last_active":"2026-03-24T12:07:35.933639+0000","last_peered":"2026-03-24T12:07:35.933639+0000","last_clean":"2026-03-24T12:07:35.933639+0000","last_became_active":"2026-03-24T10:48:23.429112+0000","last_became_peered":"2026-03-24T10:48:23.429112+0000","last_unstale":"2026-03-24T12:07:35.933639+0000","last_undegraded":"2026-03-24T12:07:35.933639+0000","last_fullsized":"2026-03-24T12:07:35.933639+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":6603,"log_dups_size":0,"ondisk_log_size":6603,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T13:58:45.552605+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00038145399999999998,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":10247,"num_read_kb":28883,"num_write":5323,"num_write_kb":1030460,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":1,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"1.0","version":"10'32","reported_seq":554,"reported_epoch":241,"state":"active+clean","last_fresh":"2026-03-24T12:07:35.933954+0000","last_change":"2026-03-24T10:48:20.416109+0000","last_active":"2026-03-24T12:07:35.933954+0000","last_peered":"2026-03-24T12:07:35.933954+0000","last_clean":"2026-03-24T12:07:35.933954+0000","last_became_active":"2026-03-24T10:48:20.415531+0000","last_became_peered":"2026-03-24T10:48:20.415531+0000","last_unstale":"2026-03-24T12:07:35.933954+0000","last_undegraded":"2026-03-24T12:07:35.933954+0000","last_fullsized":"2026-03-24T12:07:35.933954+0000","mapping_epoch":9,"log_start":"0'0","ondisk_log_start":"0'0","created":9,"last_epoch_clean":10,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:19.407878+0000","last_deep_scrub":"0'0","last_deep_sc
rub_stamp":"2026-03-24T10:48:19.407878+0000","last_clean_scrub_stamp":"2026-03-24T10:48:19.407878+0000","objects_scrubbed":0,"log_size":32,"log_dups_size":0,"ondisk_log_size":32,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T12:59:59.945556+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}],"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":27,"num_objects":7,"num_object_clones":0,"num_object_copies":14,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":7,"num_whiteouts":0,"num_read":84445,"num_read_kb":234556,"num_write":43474,"num_write_kb":8894769,"num_scrub_errors":0,"
num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":16384,"data_stored":54,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1170335,"internal_metadata":0},"log_size":47162,"ondisk_log_size":47162,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":483328,"data_stored":918560,"data_compressed":8208,"data_compressed_allocated":442368,"data_compressed_original":884736,"omap_allocated":0,"internal_meta
data":0},"log_size":32,"ondisk_log_size":32,"up":2,"acting":2,"num_store_stats":2}],"osd_stats":[{"osd":2,"up_from":8,"seq":34359739337,"num_pgs":7,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":1390200,"kb_used_data":569320,"kb_used_omap":339,"kb_used_meta":820524,"kb_avail":92981640,"statfs":{"total":96636764160,"available":95213199360,"internally_reserved":0,"allocated":582983680,"data_stored":1162760242,"data_compressed":9502208,"data_compressed_allocated":580911104,"data_compressed_original":1161822208,"omap_allocated":347206,"internal_metadata":840217530},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":11,"apply_latency_ms":11,"commit_latency_ns":11000000,"apply_latency_ns":11000000},"alerts":[]},{"osd":1,"up_from":8,"seq":34359739338,"num_pgs":13,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":1376620,"kb_used_data":483548,"kb_used_omap":561,"kb_used_meta":892494,"kb_avail":92995220,"statfs":{"total":96636764160,"available":95227105280,"internally_reserved":0,"allocated":495153152,"data_stored":987058702,"data_compressed":8065544,"data_compressed_allocated":493051904,"data_compressed_original":986103808,"omap_allocated":575432,"internal_metadata":913913912},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":16,"apply_latency_ms":16,"commit_latency_ns":16000000,"apply_latency_ns":16000000},"alerts":[]},{"osd":0,"up_from":8,"seq":34359739335,"num_pgs":14,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":1943928,"kb_used_data":1050856,"kb_used_omap":392,"kb_used_meta":892663,"kb_avail":92427912,"statfs":{"total":96636764160,"available":94646181888,"internally_reserved":0,"allocated":1076076544,"data_stored":2
148881122,"data_compressed":17567752,"data_compressed_allocated":1073963008,"data_compressed_original":2147926016,"omap_allocated":401916,"internal_metadata":914087428},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[0,0,2,0,2,2,1],"upper_bound":128},"perf_stat":{"commit_latency_ms":17,"apply_latency_ms":17,"commit_latency_ns":17000000,"apply_latency_ns":17000000},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":4104,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":4104,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":355046,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":538958,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":276331,"internal_metadata":0}]}} 2026-03-24T12:08:48.877 INFO:tasks.ceph.ceph_manager.ceph:clean! 
2026-03-24T12:08:48.877 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json 2026-03-24T12:08:49.028 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-24T12:08:49.029 INFO:teuthology.orchestra.run.vm05.stderr:dumped all 2026-03-24T12:08:49.041 INFO:teuthology.orchestra.run.vm05.stdout:{"pg_ready":true,"pg_map":{"version":2654,"stamp":"2026-03-24T12:08:49.004409+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459307,"num_objects":9,"num_object_clones":0,"num_object_copies":18,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":9,"num_whiteouts":0,"num_read":84491,"num_read_kb":234593,"num_write":43531,"num_write_kb":8895353,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":47194,"ondisk_log_size":47194,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":34,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":4161508,"kb_used_data":1536428,"kb_used_omap":1292,"kb_used_meta":2623731,"kb_avail":278954012,"statfs":{"total":2899
10292480,"available":285648908288,"internally_reserved":0,"allocated":1573302272,"data_stored":3136881712,"data_compressed":25633296,"data_compressed_allocated":1567014912,"data_compressed_original":3134029824,"omap_allocated":1323801,"internal_metadata":2686700775},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[0,0,2,0,2,2,1],"upper_bound":128},"perf_stat":{"commit_latency_ms":79,"apply_latency_ms":79,"commit_latency_ns":79000000,"apply_latency_ns":79000000},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":-1073742169,"num_objects":-267,"num_object_clones":0,"num_object_copies":-534,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":-267,"num_whiteouts":0,"num_read":-1766,"num_read_kb":-1466,"num_write":-1560,"num_write_kb":-1049105,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":-4,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"10.001277"},"pg_stats":[{"pgid":"2.7","version":"232'5268","reported_seq":9824,"reported_epoch":241,"state":"active+clean","last_fresh":"2026-03-24T12:07:35.933713+0000","last_change":
"2026-03-24T12:07:21.476270+0000","last_active":"2026-03-24T12:07:35.933713+0000","last_peered":"2026-03-24T12:07:35.933713+0000","last_clean":"2026-03-24T12:07:35.933713+0000","last_became_active":"2026-03-24T10:48:23.431584+0000","last_became_peered":"2026-03-24T10:48:23.431584+0000","last_unstale":"2026-03-24T12:07:35.933713+0000","last_undegraded":"2026-03-24T12:07:35.933713+0000","last_fullsized":"2026-03-24T12:07:35.933713+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":5268,"log_dups_size":0,"ondisk_log_size":5268,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T13:47:47.184319+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00090827900000000001,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":8686,"num_read_kb":25791,"num_write":4782,"num_write_kb":1233241,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.6","version":"232'5793","reported_seq":9406,"reported_epoch":241,"state":"active+clean","last_fresh":"2026-03-24T12:07:35.933694+0000","last_change":"2026-03-24T12:07:21.479247+0000","last_active":"2026-03-24T12:07:35.933694+0000","last_peered":"2026-03-24T12:07:35.933694+0000","last_clean":"2026-03-24T12:07:35.933694+0000","last_became_active":"2026-03-24T10:48:23.430333+0000","last_became_peered":"2026-03-24T10:48:23.430333+0000","last_unstale":"2026-03-24T12:07:35.933694+0000","last_undegraded":"2026-03-24T12:07:35.933694+0000","last_fullsized":"2026-03-24T12:07:35.933694+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_de
ep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":5793,"log_dups_size":0,"ondisk_log_size":5793,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T13:50:29.496857+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0038736769999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":7787,"num_read_kb":26709,"num_write":4797,"num_write_kb":1090629,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.5","version":"232'4794","reported_seq":8189,"reported_epoch":241,"state":"active+clean","last_fresh":"2026-03-24T12:07:35.933977+0000","last_change":"2026-03-24T12:07:21.480019+0000","last_active":"2026-03-24T12:07:35.933977+0000","last_peered":"2026-03-24T12:07:35.933977+0000","last_clean":"2026-03-24T12:07:35.933977+0000","last_became_active":"2026-03
-24T10:48:23.431659+0000","last_became_peered":"2026-03-24T10:48:23.431659+0000","last_unstale":"2026-03-24T12:07:35.933977+0000","last_undegraded":"2026-03-24T12:07:35.933977+0000","last_fullsized":"2026-03-24T12:07:35.933977+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":4794,"log_dups_size":0,"ondisk_log_size":4794,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T19:11:31.198223+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0037852200000000002,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":5400,"num_read_kb":37023,"num_write":4256,"num_write_kb":1133997,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_miss
ing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.4","version":"232'6740","reported_seq":12445,"reported_epoch":241,"state":"active+clean","last_fresh":"2026-03-24T12:07:35.933649+0000","last_change":"2026-03-24T12:07:21.479955+0000","last_active":"2026-03-24T12:07:35.933649+0000","last_peered":"2026-03-24T12:07:35.933649+0000","last_clean":"2026-03-24T12:07:35.933649+0000","last_became_active":"2026-03-24T10:48:23.430350+0000","last_became_peered":"2026-03-24T10:48:23.430350+0000","last_unstale":"2026-03-24T12:07:35.933649+0000","last_undegraded":"2026-03-24T12:07:35.933649+0000","last_fullsized":"2026-03-24T12:07:35.933649+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":6740,"log_dups_size":0,"ondisk_log_size":6740,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T13:52:30.046595+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0039362920000000001,"stat_sum":{"num_bytes":8,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":10594,"num_read_kb":24480,"num_write":5034,"num_write_kb":1092765,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.2","version":"232'7856","reported_seq":12999,"reported_epoch":241,"state":"active+clean","last_fresh":"2026-03-24T12:07:36.007642+0000","last_change":"2026-03-24T12:07:21.480604+0000","last_active":"2026-03-24T12:07:36.007642+0000","last_peered":"2026-03-24T12:07:36.007642+0000","last_clean":"2026-03-24T12:07:36.007642+0000","last_became_active":"2026-03-24T10:48:23.429577+0000","last_became_peered":"2026-03-24T10:48:23.429577+0000","last_unstale":"2026-03-24T12:07:36.007642+0000","last_undegraded":"2026-03-24T12:07:36.007642+0000","last_fullsized":"2026-03-24T12:07:36.007642+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_d
eep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":7856,"log_dups_size":0,"ondisk_log_size":7856,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T21:18:26.868949+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00022347799999999999,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":27616,"num_read_kb":42333,"num_write":10790,"num_write_kb":1162620,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.1","version":"232'4710","reported_seq":8470,"reported_epoch":242,"state":"active+clean","last_fresh":"2026-03-24T12:08:47.567339+0000","last_change":"2026-03-24T12:07:21.473704+0000","last_active":"2026-03-24T12:08:47.567339+0000","last_peered":"2026-03-24T12:08:47.567339+0000","last_clean":"2026-03-24T12:08:47.567339+0000","last_became_active":"20
26-03-24T10:48:23.429702+0000","last_became_peered":"2026-03-24T10:48:23.429702+0000","last_unstale":"2026-03-24T12:08:47.567339+0000","last_undegraded":"2026-03-24T12:08:47.567339+0000","last_fullsized":"2026-03-24T12:08:47.567339+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":4710,"log_dups_size":0,"ondisk_log_size":4710,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T16:45:53.747925+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00039478700000000001,"stat_sum":{"num_bytes":0,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":6368,"num_read_kb":23686,"num_write":4292,"num_write_kb":1081521,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_n
o_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.0","version":"232'5398","reported_seq":9478,"reported_epoch":242,"state":"active+clean","last_fresh":"2026-03-24T12:08:47.567335+0000","last_change":"2026-03-24T12:07:21.473777+0000","last_active":"2026-03-24T12:08:47.567335+0000","last_peered":"2026-03-24T12:08:47.567335+0000","last_clean":"2026-03-24T12:08:47.567335+0000","last_became_active":"2026-03-24T10:48:23.429898+0000","last_became_peered":"2026-03-24T10:48:23.429898+0000","last_unstale":"2026-03-24T12:08:47.567335+0000","last_undegraded":"2026-03-24T12:08:47.567335+0000","last_fullsized":"2026-03-24T12:08:47.567335+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":5398,"log_dups_size":0,"ondisk_log_size":5398,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T13:27:45.399809+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.000404526,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":7747,"num_read_kb":25651,"num_write":4200,"num_write_kb":1069536,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.3","version":"232'6603","reported_seq":13103,"reported_epoch":241,"state":"active+clean","last_fresh":"2026-03-24T12:07:35.933639+0000","last_change":"2026-03-24T12:07:21.476398+0000","last_active":"2026-03-24T12:07:35.933639+0000","last_peered":"2026-03-24T12:07:35.933639+0000","last_clean":"2026-03-24T12:07:35.933639+0000","last_became_active":"2026-03-24T10:48:23.429112+0000","last_became_peered":"2026-03-24T10:48:23.429112+0000","last_unstale":"2026-03-24T12:07:35.933639+0000","last_undegraded":"2026-03-24T12:07:35.933639+0000","last_fullsized":"2026-03-24T12:07:35.933639+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_s
tamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":6603,"log_dups_size":0,"ondisk_log_size":6603,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T13:58:45.552605+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00038145399999999998,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":10247,"num_read_kb":28883,"num_write":5323,"num_write_kb":1030460,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":1,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"1.0","version":"10'32","reported_seq":554,"reported_epoch":241,"state":"active+clean","last_fresh":"2026-03-24T12:07:35.933954+0000","last_change":"2026-03-24T10:48:20.416109+0000","last_active":"2026-03-24T12:07:35.933954+0000","last_peered":"2026-03-24T12:07:35.933954+0000","last_clean":"2026-03-24T12:07:35.933954+0000","last_became_active":"2026-03-24T10:48:20
.415531+0000","last_became_peered":"2026-03-24T10:48:20.415531+0000","last_unstale":"2026-03-24T12:07:35.933954+0000","last_undegraded":"2026-03-24T12:07:35.933954+0000","last_fullsized":"2026-03-24T12:07:35.933954+0000","mapping_epoch":9,"log_start":"0'0","ondisk_log_start":"0'0","created":9,"last_epoch_clean":10,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:19.407878+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:19.407878+0000","last_clean_scrub_stamp":"2026-03-24T10:48:19.407878+0000","objects_scrubbed":0,"log_size":32,"log_dups_size":0,"ondisk_log_size":32,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T12:59:59.945556+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked
_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}],"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":27,"num_objects":7,"num_object_clones":0,"num_object_copies":14,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":7,"num_whiteouts":0,"num_read":84445,"num_read_kb":234556,"num_write":43474,"num_write_kb":8894769,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":16384,"data_stored":54,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1168776,"internal_metadata":0},"log_size":47162,"ondisk_log_size":47162,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0
,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":483328,"data_stored":918560,"data_compressed":8208,"data_compressed_allocated":442368,"data_compressed_original":884736,"omap_allocated":0,"internal_metadata":0},"log_size":32,"ondisk_log_size":32,"up":2,"acting":2,"num_store_stats":2}],"osd_stats":[{"osd":2,"up_from":8,"seq":34359739338,"num_pgs":7,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":840960,"kb_used_data":2024,"kb_used_omap":338,"kb_used_meta":838573,"kb_avail":93530880,"statfs":{"total":96636764160,"available":95775621120,"internally_reserved":0,"allocated":2072576,"data_stored":941888,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":346453,"internal_metadata":858699435},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":46,"apply_latency_ms":46,"commit_latency_ns":46000000,"apply_latency_ns":46000000},"alerts":[]},{"osd":1,"up_from":8,"seq":34359739338,"num_pgs":13,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":1376620,"kb_used_data":483548,"kb_used_omap":561,"kb_used_meta":892494,"kb_avail":92995220,"statfs":{"total":96636764160,"available":95227105280,"internally_reserved":0,"allocated":495153152,"data_stored":987058702,"data_compressed":8065544,"data_compressed_allocated":493051904,"data_compressed_original":986103808,"omap_allocated":575432,"internal_metadata":913913912},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":16,"apply_la
tency_ms":16,"commit_latency_ns":16000000,"apply_latency_ns":16000000},"alerts":[]},{"osd":0,"up_from":8,"seq":34359739335,"num_pgs":14,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":1943928,"kb_used_data":1050856,"kb_used_omap":392,"kb_used_meta":892663,"kb_avail":92427912,"statfs":{"total":96636764160,"available":94646181888,"internally_reserved":0,"allocated":1076076544,"data_stored":2148881122,"data_compressed":17567752,"data_compressed_allocated":1073963008,"data_compressed_original":2147926016,"omap_allocated":401916,"internal_metadata":914087428},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[0,0,2,0,2,2,1],"upper_bound":128},"perf_stat":{"commit_latency_ms":17,"apply_latency_ms":17,"commit_latency_ns":17000000,"apply_latency_ns":17000000},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":4104,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":4104,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":355046,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":538958,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compres
sed_original":0,"omap_allocated":274772,"internal_metadata":0}]}} 2026-03-24T12:08:49.041 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd dump --format=json 2026-03-24T12:08:49.191 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-24T12:08:49.192 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":242,"fsid":"a0c8ea99-5654-4097-bf98-f2a2e799bc82","created":"2026-03-24T10:48:14.032373+0000","modified":"2026-03-24T12:08:47.559916+0000","last_up_change":"2026-03-24T10:48:18.049402+0000","last_in_change":"2026-03-24T10:48:14.710778+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":4,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":17,"max_osd":3,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"tentacle","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-24T10:48:18.414195+0000","flags":1,"flags_names":"hashpspool","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"11","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objec
ts":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_type":"Fair distribution","score_acting":2.9900000095367432,"score_stable":2.9900000095367432,"optimal_score":0.67000001668930054,"raw_score_acting":2,"raw_score_stable":2,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}},{"pool":2,"pool_name":"rbd","create_time":"2026-03-24T10:48:22.140928+0000","flags":8193,"flags_names":"hashpspool,selfmanaged_snaps","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":8,"pg_placement_num":8,"pg_placement_num_target":8,"pg_num_target":8,"pg_num_pending":8,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"232","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":50,"snap_epoch":232,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","targe
t_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{},"application_metadata":{"rbd":{}},"read_balance":{"score_type":"Fair distribution","score_acting":1.8799999952316284,"score_stable":1.8799999952316284,"optimal_score":1,"raw_score_acting":1.8799999952316284,"raw_score_stable":1.8799999952316284,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"3022b391-fa44-4bc1-b17f-aae1f26aed61","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":233,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6808","nonce":3270659984},{"type":"v1","addr":"192.168.123.105:6809","nonce":3270659984}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6810","nonce":3270659984},{"type":"v1","addr":"192.168.123.105:6811","nonce":3270659984}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6814","nonce":3270659984},{"type":"v1","addr":"192.168.123.105:6815","nonce":3270659984}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6812","nonce":3270659984},{"type":"v1","addr":"192.168.123.105:6813","nonce":3270659984}]},"public_addr":"192.168.123.105:6809/3270659984","cluster_addr":"192.168.123.105:6811/3270659984","heartbeat_back_addr":"192.168.123.105:6815/3270659984","heartbeat_front_addr":"192.168.123.105:6813/3270659984","state":["exists","up"]},{"osd":1,"uuid":"2
5e05012-9142-46e5-8250-a717d8c70af9","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":233,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6800","nonce":4104923970},{"type":"v1","addr":"192.168.123.105:6801","nonce":4104923970}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6802","nonce":4104923970},{"type":"v1","addr":"192.168.123.105:6803","nonce":4104923970}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6806","nonce":4104923970},{"type":"v1","addr":"192.168.123.105:6807","nonce":4104923970}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6804","nonce":4104923970},{"type":"v1","addr":"192.168.123.105:6805","nonce":4104923970}]},"public_addr":"192.168.123.105:6801/4104923970","cluster_addr":"192.168.123.105:6803/4104923970","heartbeat_back_addr":"192.168.123.105:6807/4104923970","heartbeat_front_addr":"192.168.123.105:6805/4104923970","state":["exists","up"]},{"osd":2,"uuid":"fc499d35-830c-40c8-b3d2-d0fdaa9d8a65","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":224,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6816","nonce":951022638},{"type":"v1","addr":"192.168.123.105:6817","nonce":951022638}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6818","nonce":951022638},{"type":"v1","addr":"192.168.123.105:6819","nonce":951022638}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6822","nonce":951022638},{"type":"v1","addr":"192.168.123.105:6823","nonce":951022638}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6820","nonce":951022638},{"type":"v1","addr":"192.168.123.105:6821","nonce":951022638}]},"public_addr":"192.168.123.105:6817/951022638","cluster_addr":"192.168.123.105:6819/951022638","heartbeat_back_addr":"192.168.123.105
:6823/951022638","heartbeat_front_addr":"192.168.123.105:6821/951022638","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"2026-03-24T10:48:16.730148+0000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"2026-03-24T10:48:16.630722+0000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4544132024016699391,"old_weight":0,"last_purged_snaps_scrub":"2026-03-24T10:48:16.745746+0000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{"192.168.123.105:0/1389144683":"2026-03-24T13:05:59.784865+0000","192.168.123.105:0/423666768":"2026-03-24T13:07:33.066414+0000","192.168.123.105:0/157535646":"2026-03-24T13:03:26.316292+0000","192.168.123.105:0/4260664610":"2026-03-24T12:59:11.867214+0000","192.168.123.105:0/4215727306":"2026-03-24T12:53:16.078087+0000","192.168.123.105:0/4074493522":"2026-03-24T12:53:15.093337+0000","192.168.123.105:0/3166531645":"2026-03-24T12:53:14.221216+0000","192.168.123.105:0/61730382":"2026-03-24T12:53:13.814077+0000","192.168.123.105:0/3764798314":"2026-03-24T12:37:36.957937+0000","192.168.123.105:0/777967554":"2026-03-24T12:37:36.489202+0000"},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"isa","technique":"reed_sol_van"}},"removed_snaps_queue":[{"pool":3,"snaps":[{"begin":2,"length":2}]},{"pool":4,"snaps":[{"begin":2,"length":2}]},{"pool":6,"snaps":[{"begin":2,"length":2}]},{"pool":10,"snaps":[{"begin":3,"length":1}]},{"pool":15,"snaps":[{"begin":2,"length":1}]}],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":
0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-24T12:08:50.205 INFO:tasks.ceph:Scrubbing osd.0 2026-03-24T12:08:50.205 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph tell osd.0 config set osd_debug_deep_scrub_sleep 0 2026-03-24T12:08:50.280 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-24T12:08:50.280 INFO:teuthology.orchestra.run.vm05.stdout: "success": "osd_debug_deep_scrub_sleep = '' (not observed, change may require restart) " 2026-03-24T12:08:50.280 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-24T12:08:50.290 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd deep-scrub 0 2026-03-24T12:08:50.443 INFO:teuthology.orchestra.run.vm05.stderr:instructed osd(s) 0 to deep-scrub 2026-03-24T12:08:50.456 INFO:tasks.ceph:Scrubbing osd.1 2026-03-24T12:08:50.456 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph tell osd.1 config set osd_debug_deep_scrub_sleep 0 2026-03-24T12:08:50.530 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-24T12:08:50.530 INFO:teuthology.orchestra.run.vm05.stdout: "success": "osd_debug_deep_scrub_sleep = '' (not observed, change may require restart) " 2026-03-24T12:08:50.530 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-24T12:08:50.540 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd deep-scrub 1 2026-03-24T12:08:50.698 INFO:teuthology.orchestra.run.vm05.stderr:instructed osd(s) 1 to deep-scrub 2026-03-24T12:08:50.711 INFO:tasks.ceph:Scrubbing osd.2 2026-03-24T12:08:50.711 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster 
ceph tell osd.2 config set osd_debug_deep_scrub_sleep 0 2026-03-24T12:08:50.786 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-24T12:08:50.786 INFO:teuthology.orchestra.run.vm05.stdout: "success": "osd_debug_deep_scrub_sleep = '' (not observed, change may require restart) " 2026-03-24T12:08:50.786 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-24T12:08:50.796 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd deep-scrub 2 2026-03-24T12:08:50.950 INFO:teuthology.orchestra.run.vm05.stderr:instructed osd(s) 2 to deep-scrub 2026-03-24T12:08:50.963 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json 2026-03-24T12:08:51.119 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-24T12:08:51.119 INFO:teuthology.orchestra.run.vm05.stderr:dumped all 2026-03-24T12:08:51.132 INFO:teuthology.orchestra.run.vm05.stdout:{"pg_ready":true,"pg_map":{"version":2655,"stamp":"2026-03-24T12:08:51.004636+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459307,"num_objects":9,"num_object_clones":0,"num_object_copies":18,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":9,"num_whiteouts":0,"num_read":84491,"num_read_kb":234593,"num_write":43531,"num_write_kb":8895353,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_o
bjects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":47194,"ondisk_log_size":47194,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":34,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":2815972,"kb_used_data":190884,"kb_used_omap":1257,"kb_used_meta":2623766,"kb_avail":280299548,"statfs":{"total":289910292480,"available":287026737152,"internally_reserved":0,"allocated":195465216,"data_stored":381231736,"data_compressed":3095568,"data_compressed_allocated":189186048,"data_compressed_original":378372096,"omap_allocated":1287198,"internal_metadata":2686737378},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":142,"apply_latency_ms":142,"commit_latency_ns":142000000,"apply_latency_ns":142000000},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":-1811939673,"num_objects":-444,"num_object_clones":0,"num_object_copies":-888,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":-444,"num_whiteouts":0,"num_read":-3469,"num_read_kb":-764439,"num_write":-2278,"num_write_kb":-1770365,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":-5,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_l
egacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"10.001271"},"pg_stats":[{"pgid":"2.7","version":"232'5268","reported_seq":9826,"reported_epoch":242,"state":"active+clean","last_fresh":"2026-03-24T12:08:47.567403+0000","last_change":"2026-03-24T12:07:21.476270+0000","last_active":"2026-03-24T12:08:47.567403+0000","last_peered":"2026-03-24T12:08:47.567403+0000","last_clean":"2026-03-24T12:08:47.567403+0000","last_became_active":"2026-03-24T10:48:23.431584+0000","last_became_peered":"2026-03-24T10:48:23.431584+0000","last_unstale":"2026-03-24T12:08:47.567403+0000","last_undegraded":"2026-03-24T12:08:47.567403+0000","last_fullsized":"2026-03-24T12:08:47.567403+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":5268,"log_dups_size":0,"ondisk_log_size":5268,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T13:47:47.184319+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00090827900000000001,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":8686,"num_read_kb":25791,"num_write":4782,"num_write_kb":1233241,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.6","version":"232'5793","reported_seq":9408,"reported_epoch":242,"state":"active+clean","last_fresh":"2026-03-24T12:08:47.567432+0000","last_change":"2026-03-24T12:07:21.479247+0000","last_active":"2026-03-24T12:08:47.567432+0000","last_peered":"2026-03-24T12:08:47.567432+0000","last_clean":"2026-03-24T12:08:47.567432+0000","last_became_active":"2026-03-24T10:48:23.430333+0000","last_became_peered":"2026-03-24T10:48:23.430333+0000","last_unstale":"2026-03-24T12:08:47.567432+0000","last_undegraded":"2026-03-24T12:08:47.567432+0000","last_fullsized":"2026-03-24T12:08:47.567432+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_de
ep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":5793,"log_dups_size":0,"ondisk_log_size":5793,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T13:50:29.496857+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0038736769999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":7787,"num_read_kb":26709,"num_write":4797,"num_write_kb":1090629,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.5","version":"232'4794","reported_seq":8191,"reported_epoch":242,"state":"active+clean","last_fresh":"2026-03-24T12:08:47.567455+0000","last_change":"2026-03-24T12:07:21.480019+0000","last_active":"2026-03-24T12:08:47.567455+0000","last_peered":"2026-03-24T12:08:47.567455+0000","last_clean":"2026-03-24T12:08:47.567455+0000","last_became_active":"2026-03
-24T10:48:23.431659+0000","last_became_peered":"2026-03-24T10:48:23.431659+0000","last_unstale":"2026-03-24T12:08:47.567455+0000","last_undegraded":"2026-03-24T12:08:47.567455+0000","last_fullsized":"2026-03-24T12:08:47.567455+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":4794,"log_dups_size":0,"ondisk_log_size":4794,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T19:11:31.198223+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0037852200000000002,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":5400,"num_read_kb":37023,"num_write":4256,"num_write_kb":1133997,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_miss
ing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.4","version":"232'6740","reported_seq":12447,"reported_epoch":242,"state":"active+clean","last_fresh":"2026-03-24T12:08:47.567478+0000","last_change":"2026-03-24T12:07:21.479955+0000","last_active":"2026-03-24T12:08:47.567478+0000","last_peered":"2026-03-24T12:08:47.567478+0000","last_clean":"2026-03-24T12:08:47.567478+0000","last_became_active":"2026-03-24T10:48:23.430350+0000","last_became_peered":"2026-03-24T10:48:23.430350+0000","last_unstale":"2026-03-24T12:08:47.567478+0000","last_undegraded":"2026-03-24T12:08:47.567478+0000","last_fullsized":"2026-03-24T12:08:47.567478+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":6740,"log_dups_size":0,"ondisk_log_size":6740,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T13:52:30.046595+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.0039362920000000001,"stat_sum":{"num_bytes":8,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":10594,"num_read_kb":24480,"num_write":5034,"num_write_kb":1092765,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.2","version":"232'7856","reported_seq":13001,"reported_epoch":242,"state":"active+clean","last_fresh":"2026-03-24T12:08:47.566997+0000","last_change":"2026-03-24T12:07:21.480604+0000","last_active":"2026-03-24T12:08:47.566997+0000","last_peered":"2026-03-24T12:08:47.566997+0000","last_clean":"2026-03-24T12:08:47.566997+0000","last_became_active":"2026-03-24T10:48:23.429577+0000","last_became_peered":"2026-03-24T10:48:23.429577+0000","last_unstale":"2026-03-24T12:08:47.566997+0000","last_undegraded":"2026-03-24T12:08:47.566997+0000","last_fullsized":"2026-03-24T12:08:47.566997+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_d
eep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":7856,"log_dups_size":0,"ondisk_log_size":7856,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T21:18:26.868949+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00022347799999999999,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":27616,"num_read_kb":42333,"num_write":10790,"num_write_kb":1162620,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.1","version":"232'4710","reported_seq":8470,"reported_epoch":242,"state":"active+clean","last_fresh":"2026-03-24T12:08:47.567339+0000","last_change":"2026-03-24T12:07:21.473704+0000","last_active":"2026-03-24T12:08:47.567339+0000","last_peered":"2026-03-24T12:08:47.567339+0000","last_clean":"2026-03-24T12:08:47.567339+0000","last_became_active":"20
26-03-24T10:48:23.429702+0000","last_became_peered":"2026-03-24T10:48:23.429702+0000","last_unstale":"2026-03-24T12:08:47.567339+0000","last_undegraded":"2026-03-24T12:08:47.567339+0000","last_fullsized":"2026-03-24T12:08:47.567339+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":4710,"log_dups_size":0,"ondisk_log_size":4710,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T16:45:53.747925+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00039478700000000001,"stat_sum":{"num_bytes":0,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":6368,"num_read_kb":23686,"num_write":4292,"num_write_kb":1081521,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_n
o_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.0","version":"232'5398","reported_seq":9478,"reported_epoch":242,"state":"active+clean","last_fresh":"2026-03-24T12:08:47.567335+0000","last_change":"2026-03-24T12:07:21.473777+0000","last_active":"2026-03-24T12:08:47.567335+0000","last_peered":"2026-03-24T12:08:47.567335+0000","last_clean":"2026-03-24T12:08:47.567335+0000","last_became_active":"2026-03-24T10:48:23.429898+0000","last_became_peered":"2026-03-24T10:48:23.429898+0000","last_unstale":"2026-03-24T12:08:47.567335+0000","last_undegraded":"2026-03-24T12:08:47.567335+0000","last_fullsized":"2026-03-24T12:08:47.567335+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":5398,"log_dups_size":0,"ondisk_log_size":5398,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T13:27:45.399809+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.000404526,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":7747,"num_read_kb":25651,"num_write":4200,"num_write_kb":1069536,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.3","version":"232'6603","reported_seq":13105,"reported_epoch":242,"state":"active+clean","last_fresh":"2026-03-24T12:08:47.567501+0000","last_change":"2026-03-24T12:07:21.476398+0000","last_active":"2026-03-24T12:08:47.567501+0000","last_peered":"2026-03-24T12:08:47.567501+0000","last_clean":"2026-03-24T12:08:47.567501+0000","last_became_active":"2026-03-24T10:48:23.429112+0000","last_became_peered":"2026-03-24T10:48:23.429112+0000","last_unstale":"2026-03-24T12:08:47.567501+0000","last_undegraded":"2026-03-24T12:08:47.567501+0000","last_fullsized":"2026-03-24T12:08:47.567501+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:22.415988+0000","last_deep_scrub":"0'0","last_deep_scrub_s
tamp":"2026-03-24T10:48:22.415988+0000","last_clean_scrub_stamp":"2026-03-24T10:48:22.415988+0000","objects_scrubbed":0,"log_size":6603,"log_dups_size":0,"ondisk_log_size":6603,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T13:58:45.552605+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00038145399999999998,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":10247,"num_read_kb":28883,"num_write":5323,"num_write_kb":1030460,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":1,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"1.0","version":"10'32","reported_seq":556,"reported_epoch":242,"state":"active+clean","last_fresh":"2026-03-24T12:08:47.567531+0000","last_change":"2026-03-24T10:48:20.416109+0000","last_active":"2026-03-24T12:08:47.567531+0000","last_peered":"2026-03-24T12:08:47.567531+0000","last_clean":"2026-03-24T12:08:47.567531+0000","last_became_active":"2026-03-24T10:48:20
.415531+0000","last_became_peered":"2026-03-24T10:48:20.415531+0000","last_unstale":"2026-03-24T12:08:47.567531+0000","last_undegraded":"2026-03-24T12:08:47.567531+0000","last_fullsized":"2026-03-24T12:08:47.567531+0000","mapping_epoch":9,"log_start":"0'0","ondisk_log_start":"0'0","created":9,"last_epoch_clean":10,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-24T10:48:19.407878+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-24T10:48:19.407878+0000","last_clean_scrub_stamp":"2026-03-24T10:48:19.407878+0000","objects_scrubbed":0,"log_size":32,"log_dups_size":0,"ondisk_log_size":32,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T12:59:59.945556+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked
_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}],"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":27,"num_objects":7,"num_object_clones":0,"num_object_copies":14,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":7,"num_whiteouts":0,"num_read":84445,"num_read_kb":234556,"num_write":43474,"num_write_kb":8894769,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":16384,"data_stored":54,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1167030,"internal_metadata":0},"log_size":47162,"ondisk_log_size":47162,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0
,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":483328,"data_stored":918560,"data_compressed":8208,"data_compressed_allocated":442368,"data_compressed_original":884736,"omap_allocated":0,"internal_metadata":0},"log_size":32,"ondisk_log_size":32,"up":2,"acting":2,"num_store_stats":2}],"osd_stats":[{"osd":2,"up_from":8,"seq":34359739338,"num_pgs":7,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":840960,"kb_used_data":2024,"kb_used_omap":338,"kb_used_meta":838573,"kb_avail":93530880,"statfs":{"total":96636764160,"available":95775621120,"internally_reserved":0,"allocated":2072576,"data_stored":941888,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":346453,"internal_metadata":858699435},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":46,"apply_latency_ms":46,"commit_latency_ns":46000000,"apply_latency_ns":46000000},"alerts":[]},{"osd":1,"up_from":8,"seq":34359739339,"num_pgs":13,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":895340,"kb_used_data":2268,"kb_used_omap":563,"kb_used_meta":892492,"kb_avail":93476500,"statfs":{"total":96636764160,"available":95719936000,"internally_reserved":0,"allocated":2322432,"data_stored":1401195,"data_compressed":4104,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":577480,"internal_metadata":913911864},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":40,"apply_latency_ms":40,"co
mmit_latency_ns":40000000,"apply_latency_ns":40000000},"alerts":[]},{"osd":0,"up_from":8,"seq":34359739336,"num_pgs":14,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":1079672,"kb_used_data":186592,"kb_used_omap":354,"kb_used_meta":892701,"kb_avail":93292168,"statfs":{"total":96636764160,"available":95531180032,"internally_reserved":0,"allocated":191070208,"data_stored":378888653,"data_compressed":3091464,"data_compressed_allocated":188964864,"data_compressed_original":377929728,"omap_allocated":363265,"internal_metadata":914126079},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":56,"apply_latency_ms":56,"commit_latency_ns":56000000,"apply_latency_ns":56000000},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":4104,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":4104,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":353196,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":539062,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":2747
72,"internal_metadata":0}]}} 2026-03-24T12:08:51.134 INFO:tasks.ceph:pgid 2.7 last_scrub_stamp 2026-03-24T10:48:22.415988+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=10, tm_min=48, tm_sec=22, tm_wday=1, tm_yday=83, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=12, tm_min=8, tm_sec=49, tm_wday=1, tm_yday=83, tm_isdst=0) 2026-03-24T12:08:51.134 INFO:tasks.ceph:pgid 2.6 last_scrub_stamp 2026-03-24T10:48:22.415988+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=10, tm_min=48, tm_sec=22, tm_wday=1, tm_yday=83, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=12, tm_min=8, tm_sec=49, tm_wday=1, tm_yday=83, tm_isdst=0) 2026-03-24T12:08:51.134 INFO:tasks.ceph:pgid 2.5 last_scrub_stamp 2026-03-24T10:48:22.415988+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=10, tm_min=48, tm_sec=22, tm_wday=1, tm_yday=83, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=12, tm_min=8, tm_sec=49, tm_wday=1, tm_yday=83, tm_isdst=0) 2026-03-24T12:08:51.134 INFO:tasks.ceph:pgid 2.4 last_scrub_stamp 2026-03-24T10:48:22.415988+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=10, tm_min=48, tm_sec=22, tm_wday=1, tm_yday=83, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=12, tm_min=8, tm_sec=49, tm_wday=1, tm_yday=83, tm_isdst=0) 2026-03-24T12:08:51.134 INFO:tasks.ceph:pgid 2.2 last_scrub_stamp 2026-03-24T10:48:22.415988+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=10, tm_min=48, tm_sec=22, tm_wday=1, tm_yday=83, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=12, tm_min=8, tm_sec=49, tm_wday=1, tm_yday=83, tm_isdst=0) 2026-03-24T12:08:51.134 INFO:tasks.ceph:pgid 2.1 last_scrub_stamp 2026-03-24T10:48:22.415988+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=10, tm_min=48, tm_sec=22, tm_wday=1, tm_yday=83, tm_isdst=-1) <= 
time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=12, tm_min=8, tm_sec=49, tm_wday=1, tm_yday=83, tm_isdst=0) 2026-03-24T12:08:51.134 INFO:tasks.ceph:pgid 2.0 last_scrub_stamp 2026-03-24T10:48:22.415988+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=10, tm_min=48, tm_sec=22, tm_wday=1, tm_yday=83, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=12, tm_min=8, tm_sec=49, tm_wday=1, tm_yday=83, tm_isdst=0) 2026-03-24T12:08:51.134 INFO:tasks.ceph:pgid 2.3 last_scrub_stamp 2026-03-24T10:48:22.415988+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=10, tm_min=48, tm_sec=22, tm_wday=1, tm_yday=83, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=12, tm_min=8, tm_sec=49, tm_wday=1, tm_yday=83, tm_isdst=0) 2026-03-24T12:08:51.134 INFO:tasks.ceph:pgid 1.0 last_scrub_stamp 2026-03-24T10:48:19.407878+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=10, tm_min=48, tm_sec=19, tm_wday=1, tm_yday=83, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=24, tm_hour=12, tm_min=8, tm_sec=49, tm_wday=1, tm_yday=83, tm_isdst=0) 2026-03-24T12:08:51.134 INFO:tasks.ceph:Still waiting for all pgs to be scrubbed. 
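The `tasks.ceph` lines above show the scrub-wait loop in action: for each PG it parses `last_scrub_stamp` into a `time.struct_time` and compares it against the time the wait started; a stamp at or before the reference time means the PG has not been scrubbed yet, so the loop logs "Still waiting" and re-polls `pg dump`. A minimal sketch of that comparison (illustrative only, not teuthology's actual helper; the function name and the truncation of fractional seconds/offset before parsing are assumptions):

```python
import time

def scrubbed_since(last_scrub_stamp: str, check_time: time.struct_time) -> bool:
    """Return True once a PG's last_scrub_stamp is newer than check_time.

    Stamps in the dump look like '2026-03-24T10:48:22.415988+0000';
    drop the fractional seconds and UTC offset before parsing, matching
    the second-resolution struct_time values printed in the log.
    """
    base = last_scrub_stamp.split('.')[0]  # '2026-03-24T10:48:22'
    stamp = time.strptime(base, '%Y-%m-%dT%H:%M:%S')
    # struct_time compares field-by-field (year, month, day, ...),
    # so a plain tuple comparison gives chronological order here.
    return stamp > check_time

check = time.strptime('2026-03-24T12:08:49', '%Y-%m-%dT%H:%M:%S')
# PG 2.7's stamp from the first dump predates the check time, so the
# loop keeps waiting:
print(scrubbed_since('2026-03-24T10:48:22.415988+0000', check))  # False
# After the scrub completes at 12:08:55 the stamp passes the check:
print(scrubbed_since('2026-03-24T12:08:55.827824+0000', check))  # True
```

This matches the pattern visible in the log: every PG in pool 2 still carries the 10:48:22 stamp at the 12:08:51 poll, and the second `pg dump` twenty seconds later shows fresh stamps (e.g. PG 2.7 at 12:08:55.827824) once the requested scrubs land.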
2026-03-24T12:09:11.134 DEBUG:teuthology.orchestra.run.vm05:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json 2026-03-24T12:09:11.296 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-24T12:09:11.296 INFO:teuthology.orchestra.run.vm05.stderr:dumped all 2026-03-24T12:09:11.312 INFO:teuthology.orchestra.run.vm05.stdout:{"pg_ready":true,"pg_map":{"version":2665,"stamp":"2026-03-24T12:09:11.007056+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459307,"num_objects":9,"num_object_clones":0,"num_object_copies":18,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":9,"num_whiteouts":0,"num_read":84491,"num_read_kb":234593,"num_write":43531,"num_write_kb":8895353,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":47203,"ondisk_log_size":47203,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":18,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":2631632,"kb_used_data":6560,"kb_used_omap":1257,"kb_used_meta":2623766,"kb_avail":280483888,"statfs":{"total":2899102
92480,"available":287215501312,"internally_reserved":0,"allocated":6717440,"data_stored":3744278,"data_compressed":8208,"data_compressed_allocated":442368,"data_compressed_original":884736,"omap_allocated":1287251,"internal_metadata":2686737325},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"12.001436"},"pg_stats":[{"pgid":"2.7","version":"232'5268","reported_seq":9834,"reported_epoch":242,"state":"active+clean","last_fresh":"2026-03-24T12:08:55.827875+0000","last_change":"2026-03-24T12:08:55.827875+0000","last_active":"2026-03-24T12:08:55.827875+0000","last_pee
red":"2026-03-24T12:08:55.827875+0000","last_clean":"2026-03-24T12:08:55.827875+0000","last_became_active":"2026-03-24T10:48:23.431584+0000","last_became_peered":"2026-03-24T10:48:23.431584+0000","last_unstale":"2026-03-24T12:08:55.827875+0000","last_undegraded":"2026-03-24T12:08:55.827875+0000","last_fullsized":"2026-03-24T12:08:55.827875+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"232'5268","last_scrub_stamp":"2026-03-24T12:08:55.827824+0000","last_deep_scrub":"232'5268","last_deep_scrub_stamp":"2026-03-24T12:08:55.827824+0000","last_clean_scrub_stamp":"2026-03-24T12:08:55.827824+0000","objects_scrubbed":0,"log_size":5268,"log_dups_size":0,"ondisk_log_size":5268,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T15:47:49.547942+0000","scrub_duration":5,"objects_trimmed":0,"snaptrim_duration":0.00090827900000000001,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":8686,"num_read_kb":25791,"num_write":4782,"num_write_kb":1233241,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.6","version":"232'5793","reported_seq":9416,"reported_epoch":242,"state":"active+clean","last_fresh":"2026-03-24T12:08:54.795203+0000","last_change":"2026-03-24T12:08:54.795203+0000","last_active":"2026-03-24T12:08:54.795203+0000","last_peered":"2026-03-24T12:08:54.795203+0000","last_clean":"2026-03-24T12:08:54.795203+0000","last_became_active":"2026-03-24T10:48:23.430333+0000","last_became_peered":"2026-03-24T10:48:23.430333+0000","last_unstale":"2026-03-24T12:08:54.795203+0000","last_undegraded":"2026-03-24T12:08:54.795203+0000","last_fullsized":"2026-03-24T12:08:54.795203+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"232'5793","last_scrub_stamp":"2026-03-24T12:08:54.795144+0000","last_deep_scrub":"232'5793
","last_deep_scrub_stamp":"2026-03-24T12:08:54.795144+0000","last_clean_scrub_stamp":"2026-03-24T12:08:54.795144+0000","objects_scrubbed":0,"log_size":5793,"log_dups_size":0,"ondisk_log_size":5793,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T18:17:04.226557+0000","scrub_duration":5,"objects_trimmed":0,"snaptrim_duration":0.0038736769999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":7787,"num_read_kb":26709,"num_write":4797,"num_write_kb":1090629,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.5","version":"232'4794","reported_seq":8199,"reported_epoch":242,"state":"active+clean","last_fresh":"2026-03-24T12:08:53.765436+0000","last_change":"2026-03-24T12:08:53.765436+0000","last_active":"2026-03-24T12:08:53.765436+0000","last_peered":"2026-03-24T12:08:53.765436+0000","last_clean":"2026-03-24T12:08:53.765436+0000","last_became_active
":"2026-03-24T10:48:23.431659+0000","last_became_peered":"2026-03-24T10:48:23.431659+0000","last_unstale":"2026-03-24T12:08:53.765436+0000","last_undegraded":"2026-03-24T12:08:53.765436+0000","last_fullsized":"2026-03-24T12:08:53.765436+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"232'4794","last_scrub_stamp":"2026-03-24T12:08:53.765273+0000","last_deep_scrub":"232'4794","last_deep_scrub_stamp":"2026-03-24T12:08:53.765273+0000","last_clean_scrub_stamp":"2026-03-24T12:08:53.765273+0000","objects_scrubbed":0,"log_size":4794,"log_dups_size":0,"ondisk_log_size":4794,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T16:47:50.435836+0000","scrub_duration":9,"objects_trimmed":0,"snaptrim_duration":0.0037852200000000002,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":5400,"num_read_kb":37023,"num_write":4256,"num_write_kb":1133997,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":
[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.4","version":"242'6742","reported_seq":12457,"reported_epoch":242,"state":"active+clean","last_fresh":"2026-03-24T12:08:51.730817+0000","last_change":"2026-03-24T12:08:51.730817+0000","last_active":"2026-03-24T12:08:51.730817+0000","last_peered":"2026-03-24T12:08:51.730817+0000","last_clean":"2026-03-24T12:08:51.730817+0000","last_became_active":"2026-03-24T10:48:23.430350+0000","last_became_peered":"2026-03-24T10:48:23.430350+0000","last_unstale":"2026-03-24T12:08:51.730817+0000","last_undegraded":"2026-03-24T12:08:51.730817+0000","last_fullsized":"2026-03-24T12:08:51.730817+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"242'6742","last_scrub_stamp":"2026-03-24T12:08:51.730784+0000","last_deep_scrub":"242'6742","last_deep_scrub_stamp":"2026-03-24T12:08:51.730784+0000","last_clean_scrub_stamp":"2026-03-24T12:08:51.730784+0000","objects_scrubbed":2,"log_size":6742,"log_dups_size":0,"ondisk_log_size":6742,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T21:07:45.387466+0000","scrub_duration":13,"objects_trimmed":0,"snaptrim_duration":0.0039362920000000001,"stat_sum":{"num_bytes":8,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":10594,"num_read_kb":24480,"num_write":5034,"num_write_kb":1092765,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.2","version":"242'7858","reported_seq":13011,"reported_epoch":242,"state":"active+clean","last_fresh":"2026-03-24T12:08:51.319643+0000","last_change":"2026-03-24T12:08:51.319643+0000","last_active":"2026-03-24T12:08:51.319643+0000","last_peered":"2026-03-24T12:08:51.319643+0000","last_clean":"2026-03-24T12:08:51.319643+0000","last_became_active":"2026-03-24T10:48:23.429577+0000","last_became_peered":"2026-03-24T10:48:23.429577+0000","last_unstale":"2026-03-24T12:08:51.319643+0000","last_undegraded":"2026-03-24T12:08:51.319643+0000","last_fullsized":"2026-03-24T12:08:51.319643+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"242'7858","last_scrub_stamp":"2026-03-24T12:08:51.319606+0000","last_deep_scrub":"242'78
58","last_deep_scrub_stamp":"2026-03-24T12:08:51.319606+0000","last_clean_scrub_stamp":"2026-03-24T12:08:51.319606+0000","objects_scrubbed":2,"log_size":7858,"log_dups_size":0,"ondisk_log_size":7858,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T14:41:23.338346+0000","scrub_duration":9,"objects_trimmed":0,"snaptrim_duration":0.00022347799999999999,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":27616,"num_read_kb":42333,"num_write":10790,"num_write_kb":1162620,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.1","version":"242'4712","reported_seq":8480,"reported_epoch":242,"state":"active+clean","last_fresh":"2026-03-24T12:08:52.754842+0000","last_change":"2026-03-24T12:08:52.754842+0000","last_active":"2026-03-24T12:08:52.754842+0000","last_peered":"2026-03-24T12:08:52.754842+0000","last_clean":"2026-03-24T12:08:52.754842+0000","last_became_
active":"2026-03-24T10:48:23.429702+0000","last_became_peered":"2026-03-24T10:48:23.429702+0000","last_unstale":"2026-03-24T12:08:52.754842+0000","last_undegraded":"2026-03-24T12:08:52.754842+0000","last_fullsized":"2026-03-24T12:08:52.754842+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"242'4712","last_scrub_stamp":"2026-03-24T12:08:52.754759+0000","last_deep_scrub":"242'4712","last_deep_scrub_stamp":"2026-03-24T12:08:52.754759+0000","last_clean_scrub_stamp":"2026-03-24T12:08:52.754759+0000","objects_scrubbed":2,"log_size":4712,"log_dups_size":0,"ondisk_log_size":4712,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T15:55:11.797648+0000","scrub_duration":9,"objects_trimmed":0,"snaptrim_duration":0.00039478700000000001,"stat_sum":{"num_bytes":0,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":6368,"num_read_kb":23686,"num_write":4292,"num_write_kb":1081521,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":2,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"a
cting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.0","version":"232'5398","reported_seq":9486,"reported_epoch":242,"state":"active+clean","last_fresh":"2026-03-24T12:08:51.737881+0000","last_change":"2026-03-24T12:08:51.737881+0000","last_active":"2026-03-24T12:08:51.737881+0000","last_peered":"2026-03-24T12:08:51.737881+0000","last_clean":"2026-03-24T12:08:51.737881+0000","last_became_active":"2026-03-24T10:48:23.429898+0000","last_became_peered":"2026-03-24T10:48:23.429898+0000","last_unstale":"2026-03-24T12:08:51.737881+0000","last_undegraded":"2026-03-24T12:08:51.737881+0000","last_fullsized":"2026-03-24T12:08:51.737881+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"232'5398","last_scrub_stamp":"2026-03-24T12:08:51.737847+0000","last_deep_scrub":"232'5398","last_deep_scrub_stamp":"2026-03-24T12:08:51.737847+0000","last_clean_scrub_stamp":"2026-03-24T12:08:51.737847+0000","objects_scrubbed":0,"log_size":5398,"log_dups_size":0,"ondisk_log_size":5398,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-25T17:39:03.775134+0000","scrub_duration":5,"objects_trimmed":0,"snaptrim_duration":0.000404526,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":7747,"num_read_kb":25651,"num_write":4200,"num_write_kb":1069536,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.3","version":"242'6604","reported_seq":13114,"reported_epoch":242,"state":"active+clean","last_fresh":"2026-03-24T12:08:52.754269+0000","last_change":"2026-03-24T12:08:52.754269+0000","last_active":"2026-03-24T12:08:52.754269+0000","last_peered":"2026-03-24T12:08:52.754269+0000","last_clean":"2026-03-24T12:08:52.754269+0000","last_became_active":"2026-03-24T10:48:23.429112+0000","last_became_peered":"2026-03-24T10:48:23.429112+0000","last_unstale":"2026-03-24T12:08:52.754269+0000","last_undegraded":"2026-03-24T12:08:52.754269+0000","last_fullsized":"2026-03-24T12:08:52.754269+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"242'6604","last_scrub_stamp":"2026-03-24T12:08:52.754187+0000","last_deep_scrub":"242'6604","last_de
ep_scrub_stamp":"2026-03-24T12:08:52.754187+0000","last_clean_scrub_stamp":"2026-03-24T12:08:52.754187+0000","objects_scrubbed":1,"log_size":6604,"log_dups_size":0,"ondisk_log_size":6604,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T19:06:16.331900+0000","scrub_duration":13,"objects_trimmed":0,"snaptrim_duration":0.00038145399999999998,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":10247,"num_read_kb":28883,"num_write":5323,"num_write_kb":1030460,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":1,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"1.0","version":"242'34","reported_seq":566,"reported_epoch":242,"state":"active+clean","last_fresh":"2026-03-24T12:08:50.729227+0000","last_change":"2026-03-24T12:08:50.729227+0000","last_active":"2026-03-24T12:08:50.729227+0000","last_peered":"2026-03-24T12:08:50.729227+0000","last_clean":"2026-03-24T12:08:50.729227+0000","last_became_active":"2026-03
-24T10:48:20.415531+0000","last_became_peered":"2026-03-24T10:48:20.415531+0000","last_unstale":"2026-03-24T12:08:50.729227+0000","last_undegraded":"2026-03-24T12:08:50.729227+0000","last_fullsized":"2026-03-24T12:08:50.729227+0000","mapping_epoch":9,"log_start":"0'0","ondisk_log_start":"0'0","created":9,"last_epoch_clean":10,"parent":"0.0","parent_split_bits":0,"last_scrub":"242'34","last_scrub_stamp":"2026-03-24T12:08:50.729176+0000","last_deep_scrub":"242'34","last_deep_scrub_stamp":"2026-03-24T12:08:50.729176+0000","last_clean_scrub_stamp":"2026-03-24T12:08:50.729176+0000","objects_scrubbed":2,"log_size":34,"log_dups_size":0,"ondisk_log_size":34,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 2026-03-25T12:53:36.738407+0000","scrub_duration":5,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_c
ounts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}],"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":27,"num_objects":7,"num_object_clones":0,"num_object_copies":14,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":7,"num_whiteouts":0,"num_read":84445,"num_read_kb":234556,"num_write":43474,"num_write_kb":8894769,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":7,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":16384,"data_stored":54,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1167030,"internal_metadata":0},"log_size":47169,"ondisk_log_size":47169,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_
flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":483328,"data_stored":918560,"data_compressed":8208,"data_compressed_allocated":442368,"data_compressed_original":884736,"omap_allocated":0,"internal_metadata":0},"log_size":34,"ondisk_log_size":34,"up":2,"acting":2,"num_store_stats":2}],"osd_stats":[{"osd":2,"up_from":8,"seq":34359739342,"num_pgs":3,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":840952,"kb_used_data":2024,"kb_used_omap":338,"kb_used_meta":838573,"kb_avail":93530888,"statfs":{"total":96636764160,"available":95775629312,"internally_reserved":0,"allocated":2072576,"data_stored":941888,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":346453,"internal_metadata":858699435},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]},{"osd":1,"up_from":8,"seq":34359739343,"num_pgs":9,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":895340,"kb_used_data":2268,"kb_used_omap":563,"kb_used_meta":892492,"kb_avail":93476500,"statfs":{"total":96636764160,"available":95719936000,"internally_reserved":0,"allocated":2322432,"data_stored":1401195,"data_compressed":4104,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":577480,"internal_metadata":913911864},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"com
mit_latency_ns":0,"apply_latency_ns":0},"alerts":[]},{"osd":0,"up_from":8,"seq":34359739340,"num_pgs":6,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":895340,"kb_used_data":2268,"kb_used_omap":354,"kb_used_meta":892701,"kb_avail":93476500,"statfs":{"total":96636764160,"available":95719936000,"internally_reserved":0,"allocated":2322432,"data_stored":1401195,"data_compressed":4104,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":363318,"internal_metadata":914126026},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":4104,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":241664,"data_stored":459280,"data_compressed":4104,"data_compressed_allocated":221184,"data_compressed_original":442368,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":353196,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":27,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":539062,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":274772,"internal_metadata":0}]}} 
2026-03-24T12:09:11.312 DEBUG:teuthology.orchestra.run.vm05:> sudo ceph --cluster ceph config set global mon_health_to_clog false 2026-03-24T12:09:11.507 INFO:teuthology.misc:Shutting down mds daemons... 2026-03-24T12:09:11.507 INFO:teuthology.misc:Shutting down osd daemons... 2026-03-24T12:09:11.507 DEBUG:tasks.ceph.osd.0:waiting for process to exit 2026-03-24T12:09:11.507 INFO:teuthology.orchestra.run:waiting for 300 2026-03-24T12:09:11.599 INFO:tasks.ceph.osd.0:Stopped 2026-03-24T12:09:11.599 DEBUG:tasks.ceph.osd.1:waiting for process to exit 2026-03-24T12:09:11.599 INFO:teuthology.orchestra.run:waiting for 300 2026-03-24T12:09:11.695 INFO:tasks.ceph.osd.1:Stopped 2026-03-24T12:09:11.695 DEBUG:tasks.ceph.osd.2:waiting for process to exit 2026-03-24T12:09:11.695 INFO:teuthology.orchestra.run:waiting for 300 2026-03-24T12:09:11.771 INFO:tasks.ceph.osd.2:Stopped 2026-03-24T12:09:11.771 INFO:teuthology.misc:Shutting down mgr daemons... 2026-03-24T12:09:11.771 DEBUG:tasks.ceph.mgr.x:waiting for process to exit 2026-03-24T12:09:11.771 INFO:teuthology.orchestra.run:waiting for 300 2026-03-24T12:09:11.802 INFO:tasks.ceph.mgr.x:Stopped 2026-03-24T12:09:11.802 INFO:teuthology.misc:Shutting down mon daemons... 2026-03-24T12:09:11.803 DEBUG:tasks.ceph.mon.a:waiting for process to exit 2026-03-24T12:09:11.803 INFO:teuthology.orchestra.run:waiting for 300 2026-03-24T12:09:11.860 INFO:tasks.ceph.mon.a:Stopped 2026-03-24T12:09:11.860 INFO:tasks.ceph:Checking cluster log for badness... 
2026-03-24T12:09:11.860 DEBUG:teuthology.orchestra.run.vm05:> sudo egrep '\[ERR\]|\[WRN\]|\[SEC\]' /var/log/ceph/ceph.log | egrep -v '\(MDS_ALL_DOWN\)' | egrep -v '\(MDS_UP_LESS_THAN_MAX\)' | egrep -v '\(OSD_SLOW_PING_TIME' | head -n 1 2026-03-24T12:09:11.918 INFO:tasks.ceph:Unmounting /var/lib/ceph/osd/ceph-0 on ubuntu@vm05.local 2026-03-24T12:09:11.918 DEBUG:teuthology.orchestra.run.vm05:> sync && sudo umount -f /var/lib/ceph/osd/ceph-0 2026-03-24T12:09:12.019 INFO:tasks.ceph:Unmounting /var/lib/ceph/osd/ceph-1 on ubuntu@vm05.local 2026-03-24T12:09:12.019 DEBUG:teuthology.orchestra.run.vm05:> sync && sudo umount -f /var/lib/ceph/osd/ceph-1 2026-03-24T12:09:12.067 INFO:tasks.ceph:Unmounting /var/lib/ceph/osd/ceph-2 on ubuntu@vm05.local 2026-03-24T12:09:12.067 DEBUG:teuthology.orchestra.run.vm05:> sync && sudo umount -f /var/lib/ceph/osd/ceph-2 2026-03-24T12:09:12.117 INFO:tasks.ceph:Archiving mon data... 2026-03-24T12:09:12.117 DEBUG:teuthology.misc:Transferring archived files from vm05:/var/lib/ceph/mon/ceph-a to /archive/kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps/3589/data/mon.a.tgz 2026-03-24T12:09:12.117 DEBUG:teuthology.orchestra.run.vm05:> mktemp 2026-03-24T12:09:12.120 INFO:teuthology.orchestra.run.vm05.stdout:/tmp/tmp.gntl7jxRa5 2026-03-24T12:09:12.120 DEBUG:teuthology.orchestra.run.vm05:> sudo tar cz -f - -C /var/lib/ceph/mon/ceph-a -- . > /tmp/tmp.gntl7jxRa5 2026-03-24T12:09:12.231 DEBUG:teuthology.orchestra.run.vm05:> sudo chmod 0666 /tmp/tmp.gntl7jxRa5 2026-03-24T12:09:12.287 DEBUG:teuthology.orchestra.remote:vm05:/tmp/tmp.gntl7jxRa5 is 485KB 2026-03-24T12:09:12.335 DEBUG:teuthology.orchestra.run.vm05:> rm -fr /tmp/tmp.gntl7jxRa5 2026-03-24T12:09:12.339 INFO:tasks.ceph:Cleaning ceph cluster... 
2026-03-24T12:09:12.339 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -rf -- /etc/ceph/ceph.conf /etc/ceph/ceph.keyring /home/ubuntu/cephtest/ceph.data /home/ubuntu/cephtest/ceph.monmap /home/ubuntu/cephtest/../*.pid 2026-03-24T12:09:12.438 INFO:teuthology.util.scanner:summary_data or yaml_file is empty! 2026-03-24T12:09:12.438 INFO:tasks.ceph:Archiving crash dumps... 2026-03-24T12:09:12.439 DEBUG:teuthology.misc:Transferring archived files from vm05:/var/lib/ceph/crash to /archive/kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps/3589/remote/vm05/crash 2026-03-24T12:09:12.439 DEBUG:teuthology.orchestra.run.vm05:> sudo tar c -f - -C /var/lib/ceph/crash -- . 2026-03-24T12:09:12.490 INFO:tasks.ceph:Compressing logs... 2026-03-24T12:09:12.490 DEBUG:teuthology.orchestra.run.vm05:> time sudo find /var/log/ceph -name '*.log' -print0 | sudo xargs --max-args=1 --max-procs=0 --verbose -0 --no-run-if-empty -- gzip -5 --verbose -- 2026-03-24T12:09:12.499 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82994.log 2026-03-24T12:09:12.499 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54546.log 2026-03-24T12:09:12.499 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82994.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72487.log 2026-03-24T12:09:12.500 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82994.log.gz 2026-03-24T12:09:12.500 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29209.log 2026-03-24T12:09:12.500 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54546.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54546.log.gz 2026-03-24T12:09:12.500 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72487.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38780.log 
2026-03-24T12:09:12.500 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72487.log.gz 2026-03-24T12:09:12.501 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.29209.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54675.log 2026-03-24T12:09:12.501 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29209.log.gz 2026-03-24T12:09:12.501 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.38780.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82929.log 2026-03-24T12:09:12.501 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54675.log: 26.3% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54675.log.gz -- replaced with /var/log/ceph/ceph-client.admin.38780.log.gz 2026-03-24T12:09:12.501 INFO:teuthology.orchestra.run.vm05.stderr: 2026-03-24T12:09:12.501 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27986.log 2026-03-24T12:09:12.501 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82929.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63838.log 2026-03-24T12:09:12.502 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82929.log.gz 2026-03-24T12:09:12.502 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44354.log 2026-03-24T12:09:12.502 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27986.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27986.log.gz 2026-03-24T12:09:12.502 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63838.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58866.log 2026-03-24T12:09:12.502 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63838.log.gz 2026-03-24T12:09:12.503 
INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44354.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36705.log 2026-03-24T12:09:12.503 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.58866.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86102.log 2026-03-24T12:09:12.503 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58866.log.gz 2026-03-24T12:09:12.503 INFO:teuthology.orchestra.run.vm05.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.44354.log.gz 2026-03-24T12:09:12.503 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36705.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36705.log.gz 2026-03-24T12:09:12.503 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84823.log 2026-03-24T12:09:12.503 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59222.log 2026-03-24T12:09:12.503 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84823.log: /var/log/ceph/ceph-client.admin.86102.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84823.log.gz 2026-03-24T12:09:12.504 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86102.log.gz 2026-03-24T12:09:12.504 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89650.log 2026-03-24T12:09:12.504 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59222.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77010.log 2026-03-24T12:09:12.504 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59222.log.gz 2026-03-24T12:09:12.504 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.89650.log: 0.0%gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37132.log 
2026-03-24T12:09:12.504 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.89650.log.gz 2026-03-24T12:09:12.504 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77010.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32427.log 2026-03-24T12:09:12.504 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77010.log.gz 2026-03-24T12:09:12.505 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.37132.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73735.log 2026-03-24T12:09:12.505 INFO:teuthology.orchestra.run.vm05.stderr: 58.8% -- replaced with /var/log/ceph/ceph-client.admin.37132.log.gz 2026-03-24T12:09:12.505 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32427.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64425.log 2026-03-24T12:09:12.505 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32427.log.gz 2026-03-24T12:09:12.505 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.73735.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73735.log.gz 2026-03-24T12:09:12.505 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43892.log 2026-03-24T12:09:12.506 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64425.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64425.log.gz 2026-03-24T12:09:12.506 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75614.log 2026-03-24T12:09:12.506 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43892.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55975.log 2026-03-24T12:09:12.506 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43892.log.gz 2026-03-24T12:09:12.506 
INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75614.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75614.log.gz 2026-03-24T12:09:12.506 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36484.log 2026-03-24T12:09:12.507 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55975.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55975.log.gz 2026-03-24T12:09:12.507 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81971.log 2026-03-24T12:09:12.507 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80141.log 2026-03-24T12:09:12.507 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36484.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36484.log.gz 2026-03-24T12:09:12.507 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.81971.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81971.log.gz 2026-03-24T12:09:12.507 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54238.log 2026-03-24T12:09:12.508 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80141.log: gzip -5 --verbose -- 0.0% /var/log/ceph/ceph-client.admin.71610.log -- replaced with /var/log/ceph/ceph-client.admin.80141.log.gz 2026-03-24T12:09:12.508 INFO:teuthology.orchestra.run.vm05.stderr: 2026-03-24T12:09:12.508 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54238.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71868.log 2026-03-24T12:09:12.508 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.71610.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.71610.log.gz 2026-03-24T12:09:12.508 INFO:teuthology.orchestra.run.vm05.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.62263.log 
2026-03-24T12:09:12.508 INFO:teuthology.orchestra.run.vm05.stderr: 25.5% -- replaced with /var/log/ceph/ceph-client.admin.54238.log.gz
2026-03-24T12:09:12.509 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.71868.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.23021.log
2026-03-24T12:09:12.509 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71868.log.gz
2026-03-24T12:09:12.509 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66152.log
2026-03-24T12:09:12.509 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62263.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62263.log.gz
2026-03-24T12:09:12.509 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.23021.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66624.log
2026-03-24T12:09:12.509 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.23021.log.gz
2026-03-24T12:09:12.509 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66152.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66152.log.gz
2026-03-24T12:09:12.509 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89693.log
2026-03-24T12:09:12.510 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66624.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30521.log
2026-03-24T12:09:12.510 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66624.log.gz
2026-03-24T12:09:12.510 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.89693.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69031.log
2026-03-24T12:09:12.510 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89693.log.gz
2026-03-24T12:09:12.510 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30521.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67776.log
2026-03-24T12:09:12.510 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30521.log.gz
2026-03-24T12:09:12.511 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.69031.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85930.log
2026-03-24T12:09:12.511 INFO:teuthology.orchestra.run.vm05.stderr: 11.1% -- replaced with /var/log/ceph/ceph-client.admin.69031.log.gz
2026-03-24T12:09:12.511 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47840.log
2026-03-24T12:09:12.511 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67776.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67776.log.gz
2026-03-24T12:09:12.511 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.85930.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42607.log
2026-03-24T12:09:12.511 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85930.log.gz
2026-03-24T12:09:12.511 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.47840.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30500.log
2026-03-24T12:09:12.511 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47840.log.gz
2026-03-24T12:09:12.512 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.42607.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59871.log
2026-03-24T12:09:12.512 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42607.log.gz
2026-03-24T12:09:12.512 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30500.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30500.log.gz
2026-03-24T12:09:12.512 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45890.log
2026-03-24T12:09:12.512 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59871.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57141.log
2026-03-24T12:09:12.512 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59871.log.gz
2026-03-24T12:09:12.513 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45890.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72528.log
2026-03-24T12:09:12.513 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45890.log.gz
2026-03-24T12:09:12.513 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.57141.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.57141.log.gz -5
2026-03-24T12:09:12.513 INFO:teuthology.orchestra.run.vm05.stderr: --verbose -- /var/log/ceph/ceph-client.admin.40107.log
2026-03-24T12:09:12.513 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72528.log: gzip -5 --verbose -- 0.0% /var/log/ceph/ceph-client.admin.60334.log
2026-03-24T12:09:12.513 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.72528.log.gz
2026-03-24T12:09:12.513 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40107.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58327.log
2026-03-24T12:09:12.514 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40107.log.gz
2026-03-24T12:09:12.514 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.60334.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60334.log.gz
2026-03-24T12:09:12.514 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39726.log
2026-03-24T12:09:12.514 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.58327.log: gzip 0.0% -5 -- replaced with /var/log/ceph/ceph-client.admin.58327.log.gz --verbose
2026-03-24T12:09:12.514 INFO:teuthology.orchestra.run.vm05.stderr: -- /var/log/ceph/ceph-client.admin.85629.log
2026-03-24T12:09:12.514 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39726.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47383.log
2026-03-24T12:09:12.514 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39726.log.gz
2026-03-24T12:09:12.515 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.85629.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85629.log.gz
2026-03-24T12:09:12.515 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56732.log
2026-03-24T12:09:12.515 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.47383.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62634.log
2026-03-24T12:09:12.515 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47383.log.gz
2026-03-24T12:09:12.515 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.56732.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31257.log
2026-03-24T12:09:12.515 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56732.log.gz
2026-03-24T12:09:12.515 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62634.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62634.log.gz
2026-03-24T12:09:12.515 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56861.log
2026-03-24T12:09:12.516 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.31257.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76019.log
2026-03-24T12:09:12.516 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31257.log.gz
2026-03-24T12:09:12.516 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.56861.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61001.log
2026-03-24T12:09:12.516 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56861.log.gz
2026-03-24T12:09:12.516 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.76019.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.76019.log.gz -5
2026-03-24T12:09:12.516 INFO:teuthology.orchestra.run.vm05.stderr: --verbose -- /var/log/ceph/ceph-client.admin.48068.log
2026-03-24T12:09:12.517 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61001.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48420.log
2026-03-24T12:09:12.517 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61001.log.gz
2026-03-24T12:09:12.517 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.48068.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77726.log
2026-03-24T12:09:12.517 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48068.log.gz
2026-03-24T12:09:12.517 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.48420.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48420.log.gz
2026-03-24T12:09:12.517 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41126.log
2026-03-24T12:09:12.517 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77726.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62226.log
2026-03-24T12:09:12.517 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77726.log.gz
2026-03-24T12:09:12.518 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73852.log
2026-03-24T12:09:12.518 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41126.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41126.log.gz
2026-03-24T12:09:12.518 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62226.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62226.log.gz
2026-03-24T12:09:12.518 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65894.log
2026-03-24T12:09:12.518 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.73852.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65485.log
2026-03-24T12:09:12.518 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73852.log.gz
2026-03-24T12:09:12.519 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71373.log
2026-03-24T12:09:12.519 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65894.log: /var/log/ceph/ceph-client.admin.65485.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65485.log.gz
2026-03-24T12:09:12.519 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.65894.log.gz
2026-03-24T12:09:12.519 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29531.log
2026-03-24T12:09:12.519 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.71373.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90187.log
2026-03-24T12:09:12.519 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71373.log.gz
2026-03-24T12:09:12.519 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.29531.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29531.log.gz
2026-03-24T12:09:12.519 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47679.log
2026-03-24T12:09:12.520 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90187.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35770.log
2026-03-24T12:09:12.520 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90187.log.gz
2026-03-24T12:09:12.520 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.47679.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54337.log
2026-03-24T12:09:12.520 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47679.log.gz
2026-03-24T12:09:12.520 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35770.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39425.log
2026-03-24T12:09:12.520 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35770.log.gz
2026-03-24T12:09:12.521 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54337.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33612.log
2026-03-24T12:09:12.521 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54337.log.gz
2026-03-24T12:09:12.521 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39425.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52707.log
2026-03-24T12:09:12.521 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39425.log.gz
2026-03-24T12:09:12.521 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.33612.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28819.log
2026-03-24T12:09:12.521 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33612.log.gz
2026-03-24T12:09:12.521 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.52707.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91126.log
2026-03-24T12:09:12.521 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52707.log.gz
2026-03-24T12:09:12.522 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28819.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58262.log
2026-03-24T12:09:12.522 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28819.log.gz
2026-03-24T12:09:12.522 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.91126.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51079.log
2026-03-24T12:09:12.522 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91126.log.gz
2026-03-24T12:09:12.522 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.58262.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58262.log.gzgzip
2026-03-24T12:09:12.522 INFO:teuthology.orchestra.run.vm05.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.28759.log
2026-03-24T12:09:12.523 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.51079.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51079.log.gz
2026-03-24T12:09:12.523 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62965.log
2026-03-24T12:09:12.523 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28759.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47039.log
2026-03-24T12:09:12.523 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28759.log.gz
2026-03-24T12:09:12.523 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62965.log: 0.0%gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88321.log
2026-03-24T12:09:12.523 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.62965.log.gz
2026-03-24T12:09:12.524 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.47039.log: gzip -5 --verbose -- /var/log/ceph/ceph.audit.log
2026-03-24T12:09:12.524 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47039.log.gz
2026-03-24T12:09:12.524 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.88321.log: gzip -5 --verbose 0.0% -- /var/log/ceph/ceph-client.admin.76427.log -- replaced with /var/log/ceph/ceph-client.admin.88321.log.gz
2026-03-24T12:09:12.524 INFO:teuthology.orchestra.run.vm05.stderr:
2026-03-24T12:09:12.524 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph.audit.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86577.log
2026-03-24T12:09:12.524 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.76427.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87330.log
2026-03-24T12:09:12.524 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76427.log.gz
2026-03-24T12:09:12.526 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.86577.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86577.log.gz
2026-03-24T12:09:12.526 INFO:teuthology.orchestra.run.vm05.stderr: 89.9% -- replaced with /var/log/ceph/ceph.audit.log.gz
2026-03-24T12:09:12.526 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85167.log
2026-03-24T12:09:12.526 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.87330.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90832.log
2026-03-24T12:09:12.527 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87330.log.gz
2026-03-24T12:09:12.527 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.85167.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85167.log.gz
2026-03-24T12:09:12.527 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71674.log
2026-03-24T12:09:12.527 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90832.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71138.log
2026-03-24T12:09:12.527 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90832.log.gz
2026-03-24T12:09:12.527 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37723.log
2026-03-24T12:09:12.527 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.71674.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71674.log.gz
2026-03-24T12:09:12.528 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.71138.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82027.log
2026-03-24T12:09:12.528 INFO:teuthology.orchestra.run.vm05.stderr: 52.7% -- replaced with /var/log/ceph/ceph-client.admin.71138.log.gz
2026-03-24T12:09:12.528 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.37723.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90531.log
2026-03-24T12:09:12.528 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37723.log.gz
2026-03-24T12:09:12.528 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82027.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73069.log
2026-03-24T12:09:12.528 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82027.log.gz
2026-03-24T12:09:12.528 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90531.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90531.log.gz
2026-03-24T12:09:12.528 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80380.log
2026-03-24T12:09:12.529 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.73069.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85586.log
2026-03-24T12:09:12.529 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73069.log.gz
2026-03-24T12:09:12.529 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73869.log
2026-03-24T12:09:12.529 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80380.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80380.log.gz/var/log/ceph/ceph-client.admin.85586.log:
2026-03-24T12:09:12.529 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85586.log.gz
2026-03-24T12:09:12.529 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47340.log
2026-03-24T12:09:12.529 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.73869.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56883.log
2026-03-24T12:09:12.529 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73869.log.gz
2026-03-24T12:09:12.530 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.47340.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47340.log.gz
2026-03-24T12:09:12.530 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34472.log
2026-03-24T12:09:12.530 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.56883.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80724.log
2026-03-24T12:09:12.530 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56883.log.gz
2026-03-24T12:09:12.530 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34472.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34472.log.gz
2026-03-24T12:09:12.530 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26793.log
2026-03-24T12:09:12.531 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80724.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80724.log.gzgzip
2026-03-24T12:09:12.531 INFO:teuthology.orchestra.run.vm05.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.91636.log
2026-03-24T12:09:12.531 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.26793.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67540.log
2026-03-24T12:09:12.531 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26793.log.gz
2026-03-24T12:09:12.531 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.91636.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91636.log.gz
2026-03-24T12:09:12.531 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68335.log
2026-03-24T12:09:12.532 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67540.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67540.log.gz
2026-03-24T12:09:12.532 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72328.log
2026-03-24T12:09:12.532 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68335.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41678.log
2026-03-24T12:09:12.532 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68335.log.gz
2026-03-24T12:09:12.532 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72328.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90811.log
2026-03-24T12:09:12.532 INFO:teuthology.orchestra.run.vm05.stderr: 59.0% -- replaced with /var/log/ceph/ceph-client.admin.72328.log.gz
2026-03-24T12:09:12.532 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41678.log: gzip 0.0% -5 -- replaced with /var/log/ceph/ceph-client.admin.41678.log.gz --verbose
2026-03-24T12:09:12.532 INFO:teuthology.orchestra.run.vm05.stderr: -- /var/log/ceph/ceph-client.admin.65786.log
2026-03-24T12:09:12.533 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90811.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63497.log
2026-03-24T12:09:12.533 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90811.log.gz
2026-03-24T12:09:12.533 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65786.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65786.log.gz
2026-03-24T12:09:12.533 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40090.log
2026-03-24T12:09:12.533 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63497.log: gzip -5 0.0% --verbose -- replaced with /var/log/ceph/ceph-client.admin.63497.log.gz --
2026-03-24T12:09:12.533 INFO:teuthology.orchestra.run.vm05.stderr: /var/log/ceph/ceph-client.admin.59785.log
2026-03-24T12:09:12.534 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40090.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81543.log
2026-03-24T12:09:12.534 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59785.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.40090.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59785.log.gz
2026-03-24T12:09:12.534 INFO:teuthology.orchestra.run.vm05.stderr:
2026-03-24T12:09:12.534 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86706.log
2026-03-24T12:09:12.534 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.81543.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48294.log
2026-03-24T12:09:12.534 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81543.log.gz
2026-03-24T12:09:12.535 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53116.log
2026-03-24T12:09:12.535 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.86706.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86706.log.gz
2026-03-24T12:09:12.535 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83556.log
2026-03-24T12:09:12.535 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.48294.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48294.log.gz
2026-03-24T12:09:12.535 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.53116.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33892.log
2026-03-24T12:09:12.535 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53116.log.gz
2026-03-24T12:09:12.535 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83556.log: gzip 0.0% -5 --verbose -- replaced with /var/log/ceph/ceph-client.admin.83556.log.gz --
2026-03-24T12:09:12.535 INFO:teuthology.orchestra.run.vm05.stderr: /var/log/ceph/ceph-client.admin.81694.log
2026-03-24T12:09:12.536 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.33892.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78478.log
2026-03-24T12:09:12.536 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33892.log.gz
2026-03-24T12:09:12.536 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.81694.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91004.log
2026-03-24T12:09:12.536 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81694.log.gz
2026-03-24T12:09:12.536 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78478.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80681.log
2026-03-24T12:09:12.536 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78478.log.gz
2026-03-24T12:09:12.537 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.91004.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47779.log
2026-03-24T12:09:12.537 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91004.log.gz
2026-03-24T12:09:12.537 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80681.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89908.log
2026-03-24T12:09:12.537 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80681.log.gz
2026-03-24T12:09:12.537 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.47779.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47779.log.gz
2026-03-24T12:09:12.537 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27879.log
2026-03-24T12:09:12.537 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.89908.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76666.log
2026-03-24T12:09:12.537 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89908.log.gz
2026-03-24T12:09:12.538 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27879.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27879.log.gz
2026-03-24T12:09:12.538 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45697.log
2026-03-24T12:09:12.538 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.76666.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76666.log.gz
2026-03-24T12:09:12.538 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54398.log
2026-03-24T12:09:12.538 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45697.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55911.log
2026-03-24T12:09:12.538 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45697.log.gz
2026-03-24T12:09:12.538 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54398.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54398.log.gz
2026-03-24T12:09:12.538 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65306.log
2026-03-24T12:09:12.539 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55911.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38917.log
2026-03-24T12:09:12.539 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55911.log.gz
2026-03-24T12:09:12.539 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65306.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68099.log
2026-03-24T12:09:12.539 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65306.log.gz
2026-03-24T12:09:12.539 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.38917.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26856.log
2026-03-24T12:09:12.539 INFO:teuthology.orchestra.run.vm05.stderr: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.38917.log.gz
2026-03-24T12:09:12.540 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68099.log: gzip 0.0% -5 -- replaced with /var/log/ceph/ceph-client.admin.68099.log.gz --verbose
2026-03-24T12:09:12.540 INFO:teuthology.orchestra.run.vm05.stderr: -- /var/log/ceph/ceph-client.admin.89972.log
2026-03-24T12:09:12.540 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.26856.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45049.log
2026-03-24T12:09:12.540 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.89972.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53559.log
2026-03-24T12:09:12.540 INFO:teuthology.orchestra.run.vm05.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.89972.log.gz
2026-03-24T12:09:12.540 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.26856.log.gz
2026-03-24T12:09:12.540 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45049.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37011.log
2026-03-24T12:09:12.540 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45049.log.gz
2026-03-24T12:09:12.541 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71118.log
2026-03-24T12:09:12.541 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.53559.log: /var/log/ceph/ceph-client.admin.37011.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67733.log
2026-03-24T12:09:12.541 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53559.log.gz
2026-03-24T12:09:12.541 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37011.log.gz
2026-03-24T12:09:12.541 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.71118.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71118.log.gz
2026-03-24T12:09:12.541 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46369.log
2026-03-24T12:09:12.542 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67733.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56947.log
2026-03-24T12:09:12.542 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67733.log.gz
2026-03-24T12:09:12.542 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46369.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84608.log
2026-03-24T12:09:12.542 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46369.log.gz
2026-03-24T12:09:12.542 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.56947.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56947.log.gz
2026-03-24T12:09:12.542 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49502.log
2026-03-24T12:09:12.543 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59728.log
2026-03-24T12:09:12.543 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84608.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84608.log.gz
2026-03-24T12:09:12.543 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27092.log
2026-03-24T12:09:12.543 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.49502.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49502.log.gz/var/log/ceph/ceph-client.admin.59728.log:
2026-03-24T12:09:12.543 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59728.log.gz
2026-03-24T12:09:12.543 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61474.log
2026-03-24T12:09:12.543 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27092.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33092.log
2026-03-24T12:09:12.543 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27092.log.gz 2026-03-24T12:09:12.544 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61474.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61474.log.gz 2026-03-24T12:09:12.544 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29337.log 2026-03-24T12:09:12.544 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.33092.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68971.log 2026-03-24T12:09:12.544 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.29337.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29337.log.gz 2026-03-24T12:09:12.544 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62905.log 2026-03-24T12:09:12.544 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.33092.log.gz 2026-03-24T12:09:12.544 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68971.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35804.log 2026-03-24T12:09:12.544 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68971.log.gz 2026-03-24T12:09:12.545 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62905.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62905.log.gz 2026-03-24T12:09:12.545 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76623.log 2026-03-24T12:09:12.545 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56926.log 2026-03-24T12:09:12.545 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35804.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35804.log.gz 2026-03-24T12:09:12.545 
INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80960.log 2026-03-24T12:09:12.545 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.76623.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76623.log.gz 2026-03-24T12:09:12.545 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.56926.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56926.log.gz 2026-03-24T12:09:12.546 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41947.log 2026-03-24T12:09:12.546 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80960.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80960.log.gz 2026-03-24T12:09:12.546 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68399.log 2026-03-24T12:09:12.546 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41947.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85081.log 2026-03-24T12:09:12.546 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41947.log.gz 2026-03-24T12:09:12.546 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68399.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68399.log.gz 2026-03-24T12:09:12.546 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52664.log 2026-03-24T12:09:12.547 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.85081.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85081.log.gz 2026-03-24T12:09:12.547 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48318.log 2026-03-24T12:09:12.547 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45654.log 2026-03-24T12:09:12.547 
INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.52664.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52664.log.gz 2026-03-24T12:09:12.547 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.48318.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48318.log.gz 2026-03-24T12:09:12.547 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66419.log 2026-03-24T12:09:12.548 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45654.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25554.log 2026-03-24T12:09:12.548 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45654.log.gz 2026-03-24T12:09:12.548 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67088.log 2026-03-24T12:09:12.548 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66419.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66419.log.gz 2026-03-24T12:09:12.548 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.25554.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25554.log.gz 2026-03-24T12:09:12.548 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83678.log 2026-03-24T12:09:12.549 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67088.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61668.log 2026-03-24T12:09:12.549 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67088.log.gz 2026-03-24T12:09:12.549 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83678.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44154.log 2026-03-24T12:09:12.549 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83678.log.gz 
2026-03-24T12:09:12.549 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61668.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61668.log.gz 2026-03-24T12:09:12.549 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42365.log 2026-03-24T12:09:12.549 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44154.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65050.log 2026-03-24T12:09:12.550 INFO:teuthology.orchestra.run.vm05.stderr: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.44154.log.gz 2026-03-24T12:09:12.550 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60528.log 2026-03-24T12:09:12.550 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.42365.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42365.log.gz 2026-03-24T12:09:12.550 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65050.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42483.log 2026-03-24T12:09:12.550 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65050.log.gz 2026-03-24T12:09:12.550 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53780.log 2026-03-24T12:09:12.550 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.60528.log: /var/log/ceph/ceph-client.admin.42483.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60528.log.gz 2026-03-24T12:09:12.551 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42483.log.gzgzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40419.log 2026-03-24T12:09:12.551 INFO:teuthology.orchestra.run.vm05.stderr: 2026-03-24T12:09:12.551 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.53780.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.53780.log.gz 2026-03-24T12:09:12.551 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74059.log 2026-03-24T12:09:12.551 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40419.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34492.log 2026-03-24T12:09:12.551 INFO:teuthology.orchestra.run.vm05.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.40419.log.gz 2026-03-24T12:09:12.551 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34552.log 2026-03-24T12:09:12.552 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74059.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74059.log.gz 2026-03-24T12:09:12.552 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34492.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34492.log.gz 2026-03-24T12:09:12.552 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37702.log 2026-03-24T12:09:12.552 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34552.log: gzip -5 --verbose 0.0% -- -- replaced with /var/log/ceph/ceph-client.admin.34552.log.gz /var/log/ceph/ceph-client.admin.65030.log 2026-03-24T12:09:12.552 INFO:teuthology.orchestra.run.vm05.stderr: 2026-03-24T12:09:12.552 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.37702.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49028.log 2026-03-24T12:09:12.553 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65030.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65030.log.gz 2026-03-24T12:09:12.553 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46646.log 2026-03-24T12:09:12.553 INFO:teuthology.orchestra.run.vm05.stderr: 26.2% -- replaced with 
/var/log/ceph/ceph-client.admin.37702.log.gz 2026-03-24T12:09:12.553 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.49028.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64187.log 2026-03-24T12:09:12.553 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49028.log.gz 2026-03-24T12:09:12.553 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46646.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36249.log 0.0% 2026-03-24T12:09:12.553 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.46646.log.gz 2026-03-24T12:09:12.553 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64187.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51187.log 2026-03-24T12:09:12.553 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64187.log.gz 2026-03-24T12:09:12.554 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36249.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41448.log 2026-03-24T12:09:12.554 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36249.log.gz 2026-03-24T12:09:12.554 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.51187.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79100.log 2026-03-24T12:09:12.554 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51187.log.gz 2026-03-24T12:09:12.554 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44513.log 2026-03-24T12:09:12.554 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41448.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41448.log.gz 2026-03-24T12:09:12.554 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.79100.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.79100.log.gz 2026-03-24T12:09:12.555 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43751.log 2026-03-24T12:09:12.555 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31621.log 2026-03-24T12:09:12.555 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44513.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44513.log.gz 2026-03-24T12:09:12.555 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43751.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37841.log 2026-03-24T12:09:12.555 INFO:teuthology.orchestra.run.vm05.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.43751.log.gz 2026-03-24T12:09:12.555 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.31621.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31621.log.gz 2026-03-24T12:09:12.555 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64264.log 2026-03-24T12:09:12.556 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.37841.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42794.log 2026-03-24T12:09:12.556 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64264.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64264.log.gz 2026-03-24T12:09:12.556 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72568.log 2026-03-24T12:09:12.556 INFO:teuthology.orchestra.run.vm05.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.37841.log.gz 2026-03-24T12:09:12.556 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.42794.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74008.log 2026-03-24T12:09:12.557 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72568.log: 
57.2% -- replaced with /var/log/ceph/ceph-client.admin.42794.log.gz 2026-03-24T12:09:12.557 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59459.log 2026-03-24T12:09:12.557 INFO:teuthology.orchestra.run.vm05.stderr: 25.9% -- replaced with /var/log/ceph/ceph-client.admin.72568.log.gz 2026-03-24T12:09:12.557 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74008.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39339.log 2026-03-24T12:09:12.557 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74008.log.gz 2026-03-24T12:09:12.557 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63477.log 2026-03-24T12:09:12.557 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59459.log: /var/log/ceph/ceph-client.admin.39339.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59459.log.gz 2026-03-24T12:09:12.557 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39339.log.gz 2026-03-24T12:09:12.557 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83015.log 2026-03-24T12:09:12.558 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63477.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61935.log 2026-03-24T12:09:12.558 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63477.log.gz 2026-03-24T12:09:12.558 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83015.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59024.log 2026-03-24T12:09:12.558 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83015.log.gz 2026-03-24T12:09:12.558 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61935.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.61935.log.gz 2026-03-24T12:09:12.558 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46387.log 2026-03-24T12:09:12.559 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63437.log 2026-03-24T12:09:12.559 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59024.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59024.log.gz 2026-03-24T12:09:12.559 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55588.log 2026-03-24T12:09:12.559 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46387.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46387.log.gz 2026-03-24T12:09:12.559 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63437.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63437.log.gz 2026-03-24T12:09:12.559 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66522.log 2026-03-24T12:09:12.559 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55588.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55588.log.gz 2026-03-24T12:09:12.560 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36419.log 2026-03-24T12:09:12.560 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77290.log 2026-03-24T12:09:12.560 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66522.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66522.log.gz/var/log/ceph/ceph-client.admin.36419.log: 2026-03-24T12:09:12.560 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36419.log.gz 2026-03-24T12:09:12.560 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.89521.log 2026-03-24T12:09:12.560 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77290.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89671.log 2026-03-24T12:09:12.560 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77290.log.gz 2026-03-24T12:09:12.561 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.89521.log: gzip -5 --verbose 0.0% -- /var/log/ceph/ceph-client.admin.78962.log -- replaced with /var/log/ceph/ceph-client.admin.89521.log.gz 2026-03-24T12:09:12.561 INFO:teuthology.orchestra.run.vm05.stderr: 2026-03-24T12:09:12.561 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.89671.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61151.log 2026-03-24T12:09:12.561 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89671.log.gz 2026-03-24T12:09:12.561 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78962.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55739.log 2026-03-24T12:09:12.561 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78962.log.gz 2026-03-24T12:09:12.561 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61151.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36147.log 2026-03-24T12:09:12.561 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61151.log.gz 2026-03-24T12:09:12.562 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55739.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63973.log 2026-03-24T12:09:12.562 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55739.log.gz 2026-03-24T12:09:12.562 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36147.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.41699.log 2026-03-24T12:09:12.562 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36147.log.gz 2026-03-24T12:09:12.562 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63973.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63973.log.gz 2026-03-24T12:09:12.562 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48254.log 2026-03-24T12:09:12.563 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41699.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91956.log 2026-03-24T12:09:12.563 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.48254.log: 17.8% -- replaced with /var/log/ceph/ceph-client.admin.41699.log.gzgzip 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48254.log.gz -5 2026-03-24T12:09:12.563 INFO:teuthology.orchestra.run.vm05.stderr: --verbose -- /var/log/ceph/ceph-client.admin.68013.log 2026-03-24T12:09:12.563 INFO:teuthology.orchestra.run.vm05.stderr: 2026-03-24T12:09:12.563 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.91956.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91956.log.gz 2026-03-24T12:09:12.563 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67196.log 2026-03-24T12:09:12.563 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89779.log 2026-03-24T12:09:12.564 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68013.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68013.log.gz 2026-03-24T12:09:12.564 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49953.log 2026-03-24T12:09:12.564 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67196.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.67196.log.gz 2026-03-24T12:09:12.564 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.89779.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89779.log.gz 2026-03-24T12:09:12.564 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53616.log 2026-03-24T12:09:12.564 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.49953.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88378.log 2026-03-24T12:09:12.564 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49953.log.gz 2026-03-24T12:09:12.564 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.53616.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77075.log 2026-03-24T12:09:12.565 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53616.log.gz 2026-03-24T12:09:12.565 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.88378.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88378.log.gz 2026-03-24T12:09:12.565 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64207.log 2026-03-24T12:09:12.565 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77075.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56307.log 2026-03-24T12:09:12.565 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77075.log.gz 2026-03-24T12:09:12.565 INFO:teuthology.orchestra.run.vm05.stderr:gzip/var/log/ceph/ceph-client.admin.64207.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.84001.log 2026-03-24T12:09:12.565 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64207.log.gz 2026-03-24T12:09:12.566 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.56307.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.56307.log.gz 2026-03-24T12:09:12.566 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86556.log 2026-03-24T12:09:12.566 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84001.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.84001.log.gz 2026-03-24T12:09:12.566 INFO:teuthology.orchestra.run.vm05.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.45135.log 2026-03-24T12:09:12.566 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.86556.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38150.log 2026-03-24T12:09:12.566 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86556.log.gz 2026-03-24T12:09:12.567 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45135.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45135.log.gz 2026-03-24T12:09:12.567 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87892.log 2026-03-24T12:09:12.567 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.38150.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45157.log 2026-03-24T12:09:12.567 INFO:teuthology.orchestra.run.vm05.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.38150.log.gz 2026-03-24T12:09:12.567 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.87892.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36977.log 2026-03-24T12:09:12.568 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87892.log.gz 2026-03-24T12:09:12.568 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45157.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71502.log 2026-03-24T12:09:12.568 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.45157.log.gz 2026-03-24T12:09:12.568 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36977.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62713.log 2026-03-24T12:09:12.568 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36977.log.gz 2026-03-24T12:09:12.568 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.71502.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58305.log 2026-03-24T12:09:12.568 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71502.log.gz 2026-03-24T12:09:12.569 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62713.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62403.log 2026-03-24T12:09:12.569 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62713.log.gz 2026-03-24T12:09:12.569 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.58305.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58305.log.gz 2026-03-24T12:09:12.569 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30977.log 2026-03-24T12:09:12.569 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62403.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52858.log 2026-03-24T12:09:12.569 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62403.log.gz 2026-03-24T12:09:12.570 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30977.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46721.log 2026-03-24T12:09:12.570 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30977.log.gz 2026-03-24T12:09:12.570 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.52858.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.68163.log 2026-03-24T12:09:12.570 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52858.log.gz 2026-03-24T12:09:12.570 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46721.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41276.log 2026-03-24T12:09:12.570 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46721.log.gz 2026-03-24T12:09:12.571 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68163.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68163.log.gz 2026-03-24T12:09:12.571 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86900.log 2026-03-24T12:09:12.571 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41276.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41276.log.gz 2026-03-24T12:09:12.571 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52472.log 2026-03-24T12:09:12.571 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.86900.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68314.log 2026-03-24T12:09:12.571 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86900.log.gz 2026-03-24T12:09:12.571 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.52472.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69620.log 2026-03-24T12:09:12.572 INFO:teuthology.orchestra.run.vm05.stderr: 54.5% -- replaced with /var/log/ceph/ceph-client.admin.52472.log.gz 2026-03-24T12:09:12.572 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68314.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68314.log.gz 2026-03-24T12:09:12.572 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.82714.log
2026-03-24T12:09:12.572 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38675.log
2026-03-24T12:09:12.572 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.69620.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69620.log.gz
2026-03-24T12:09:12.572 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82714.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82714.log.gz
2026-03-24T12:09:12.572 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85235.log
2026-03-24T12:09:12.573 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.38675.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54259.log
2026-03-24T12:09:12.573 INFO:teuthology.orchestra.run.vm05.stderr: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.38675.log.gz
2026-03-24T12:09:12.573 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82908.log
2026-03-24T12:09:12.573 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.85235.log: 29.6% -- replaced with /var/log/ceph/ceph-client.admin.85235.log.gz/var/log/ceph/ceph-client.admin.54259.log:
2026-03-24T12:09:12.573 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45071.log
2026-03-24T12:09:12.573 INFO:teuthology.orchestra.run.vm05.stderr: 26.8% -- replaced with /var/log/ceph/ceph-client.admin.54259.log.gz
2026-03-24T12:09:12.573 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82908.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82908.log.gz
2026-03-24T12:09:12.574 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34172.log
2026-03-24T12:09:12.574 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45071.log: gzip 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45071.log.gz
2026-03-24T12:09:12.574 INFO:teuthology.orchestra.run.vm05.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.57673.log
2026-03-24T12:09:12.574 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34172.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54483.log
2026-03-24T12:09:12.574 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34172.log.gz
2026-03-24T12:09:12.574 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.57673.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43166.log
2026-03-24T12:09:12.575 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54483.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57673.log.gz
2026-03-24T12:09:12.575 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54483.log.gz
2026-03-24T12:09:12.575 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82779.log
2026-03-24T12:09:12.575 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43166.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63380.log
2026-03-24T12:09:12.575 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43166.log.gz
2026-03-24T12:09:12.575 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82779.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82779.log.gz
2026-03-24T12:09:12.575 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41636.log
2026-03-24T12:09:12.576 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63380.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67798.log
2026-03-24T12:09:12.576 INFO:teuthology.orchestra.run.vm05.stderr: 55.5% -- replaced with /var/log/ceph/ceph-client.admin.63380.log.gz
2026-03-24T12:09:12.576 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84280.log
2026-03-24T12:09:12.576 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41636.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41636.log.gz
2026-03-24T12:09:12.576 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67798.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64014.log
2026-03-24T12:09:12.576 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67798.log.gz
2026-03-24T12:09:12.576 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84280.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50480.log
2026-03-24T12:09:12.576 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84280.log.gz
2026-03-24T12:09:12.577 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73650.log
2026-03-24T12:09:12.577 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64014.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50628.log
2026-03-24T12:09:12.577 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50480.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64014.log.gz
2026-03-24T12:09:12.577 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50480.log.gz
2026-03-24T12:09:12.577 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.73650.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73650.log.gz
2026-03-24T12:09:12.577 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26576.log
2026-03-24T12:09:12.577 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50628.log: gzip -5 --verbose -- 0.0% /var/log/ceph/ceph-client.admin.49738.log -- replaced with /var/log/ceph/ceph-client.admin.50628.log.gz
2026-03-24T12:09:12.577 INFO:teuthology.orchestra.run.vm05.stderr:
2026-03-24T12:09:12.578 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.26576.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68228.log
2026-03-24T12:09:12.578 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26576.log.gz
2026-03-24T12:09:12.578 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.49738.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49738.log.gz
2026-03-24T12:09:12.578 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91612.log
2026-03-24T12:09:12.579 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68228.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88107.log
2026-03-24T12:09:12.579 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68228.log.gz
2026-03-24T12:09:12.579 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72447.log
2026-03-24T12:09:12.579 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.91612.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91612.log.gz
2026-03-24T12:09:12.579 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.88107.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88107.log.gz
2026-03-24T12:09:12.579 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48047.log
2026-03-24T12:09:12.579 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72447.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35529.log
2026-03-24T12:09:12.579 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72447.log.gz
2026-03-24T12:09:12.579 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64224.log
2026-03-24T12:09:12.580 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.48047.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48047.log.gz
2026-03-24T12:09:12.580 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35529.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48900.log
2026-03-24T12:09:12.580 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35529.log.gz
2026-03-24T12:09:12.580 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64224.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88656.log
2026-03-24T12:09:12.580 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64224.log.gz
2026-03-24T12:09:12.580 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.48900.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41657.log
2026-03-24T12:09:12.580 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48900.log.gz
2026-03-24T12:09:12.580 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.88656.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78070.log
2026-03-24T12:09:12.581 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88656.log.gz
2026-03-24T12:09:12.581 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41657.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41657.log.gz
2026-03-24T12:09:12.581 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77354.log
2026-03-24T12:09:12.581 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78070.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31535.log
2026-03-24T12:09:12.581 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78070.log.gz
2026-03-24T12:09:12.581 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77354.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77354.log.gz
2026-03-24T12:09:12.581 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50821.log
2026-03-24T12:09:12.582 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46315.log
2026-03-24T12:09:12.582 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.31535.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31535.log.gz
2026-03-24T12:09:12.582 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50821.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55760.log
2026-03-24T12:09:12.582 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50821.log.gz
2026-03-24T12:09:12.582 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46315.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46315.log.gz
2026-03-24T12:09:12.582 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60700.log
2026-03-24T12:09:12.583 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62592.log
2026-03-24T12:09:12.583 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55760.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55760.log.gz
2026-03-24T12:09:12.583 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.60700.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81758.log
2026-03-24T12:09:12.583 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60700.log.gz
2026-03-24T12:09:12.583 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62592.log: gzip 0.0% -5 -- replaced with /var/log/ceph/ceph-client.admin.62592.log.gz --verbose
2026-03-24T12:09:12.583 INFO:teuthology.orchestra.run.vm05.stderr: -- /var/log/ceph/ceph-client.admin.33632.log
2026-03-24T12:09:12.583 INFO:teuthology.orchestra.run.vm05.stderr:gzip/var/log/ceph/ceph-client.admin.81758.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.36501.log
2026-03-24T12:09:12.583 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81758.log.gz
2026-03-24T12:09:12.584 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.33632.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45460.log
2026-03-24T12:09:12.584 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33632.log.gz
2026-03-24T12:09:12.584 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36501.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36501.log.gz
2026-03-24T12:09:12.584 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28459.log
2026-03-24T12:09:12.584 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45460.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.45460.log.gz
2026-03-24T12:09:12.584 INFO:teuthology.orchestra.run.vm05.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.26911.log
2026-03-24T12:09:12.585 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28459.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28459.log.gz
2026-03-24T12:09:12.585 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45439.log
2026-03-24T12:09:12.585 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.26911.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86123.log
2026-03-24T12:09:12.585 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26911.log.gz
2026-03-24T12:09:12.585 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45439.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40803.log
2026-03-24T12:09:12.585 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45439.log.gz
2026-03-24T12:09:12.586 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.86123.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44813.log
2026-03-24T12:09:12.586 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86123.log.gz
2026-03-24T12:09:12.586 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40803.log: gzip 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40803.log.gz -5
2026-03-24T12:09:12.586 INFO:teuthology.orchestra.run.vm05.stderr: --verbose -- /var/log/ceph/ceph-client.admin.90488.log
2026-03-24T12:09:12.586 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44813.log: gzip -5 0.0% --verbose -- -- replaced with /var/log/ceph/ceph-client.admin.44813.log.gz
2026-03-24T12:09:12.586 INFO:teuthology.orchestra.run.vm05.stderr: /var/log/ceph/ceph-client.admin.87591.log
2026-03-24T12:09:12.587 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90488.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34312.log
2026-03-24T12:09:12.587 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90488.log.gz
2026-03-24T12:09:12.587 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.87591.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72667.log
2026-03-24T12:09:12.587 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87591.log.gz
2026-03-24T12:09:12.587 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34312.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34312.log.gz
2026-03-24T12:09:12.587 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34632.log
2026-03-24T12:09:12.587 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72667.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50521.log
2026-03-24T12:09:12.587 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72667.log.gz
2026-03-24T12:09:12.588 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34632.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34632.log.gz
2026-03-24T12:09:12.588 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60141.log
2026-03-24T12:09:12.588 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50521.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30326.log
2026-03-24T12:09:12.588 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50521.log.gz
2026-03-24T12:09:12.589 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.60141.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77498.log
2026-03-24T12:09:12.589 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60141.log.gz
2026-03-24T12:09:12.589 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30326.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30326.log.gz
2026-03-24T12:09:12.589 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50671.log
2026-03-24T12:09:12.589 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77498.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77498.log.gz
2026-03-24T12:09:12.589 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50168.log
2026-03-24T12:09:12.589 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50671.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56797.log
2026-03-24T12:09:12.589 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50671.log.gz
2026-03-24T12:09:12.590 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50168.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50168.log.gz
2026-03-24T12:09:12.590 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25483.log
2026-03-24T12:09:12.590 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.56797.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49243.log
2026-03-24T12:09:12.590 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56797.log.gz
2026-03-24T12:09:12.590 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.25483.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26533.log
2026-03-24T12:09:12.590 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25483.log.gz
2026-03-24T12:09:12.590 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.49243.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64990.log
2026-03-24T12:09:12.590 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49243.log.gz
2026-03-24T12:09:12.591 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.26533.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88193.log
2026-03-24T12:09:12.591 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26533.log.gz
2026-03-24T12:09:12.591 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64990.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81178.log
2026-03-24T12:09:12.591 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64990.log.gz
2026-03-24T12:09:12.591 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.88193.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88193.log.gz
2026-03-24T12:09:12.591 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31660.log
2026-03-24T12:09:12.592 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.81178.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51294.log
2026-03-24T12:09:12.592 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81178.log.gz
2026-03-24T12:09:12.592 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.31660.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89387.log
2026-03-24T12:09:12.592 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.51294.log: 1.2%gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64090.log
2026-03-24T12:09:12.592 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.31660.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51294.log.gz
2026-03-24T12:09:12.592 INFO:teuthology.orchestra.run.vm05.stderr:
2026-03-24T12:09:12.592 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.89387.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84543.log
2026-03-24T12:09:12.592 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89387.log.gz
2026-03-24T12:09:12.593 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43594.log
2026-03-24T12:09:12.593 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64090.log: /var/log/ceph/ceph-client.admin.84543.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64090.log.gz
2026-03-24T12:09:12.593 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84543.log.gz
2026-03-24T12:09:12.593 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43272.log
2026-03-24T12:09:12.593 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43594.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90703.log
2026-03-24T12:09:12.593 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43594.log.gz
2026-03-24T12:09:12.594 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43272.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78821.log
2026-03-24T12:09:12.594 INFO:teuthology.orchestra.run.vm05.stderr: 26.2% -- replaced with /var/log/ceph/ceph-client.admin.43272.log.gz
2026-03-24T12:09:12.594 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90703.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90703.log.gz
2026-03-24T12:09:12.594 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62731.log
2026-03-24T12:09:12.594 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78821.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89182.log
2026-03-24T12:09:12.594 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78821.log.gz
2026-03-24T12:09:12.595 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62731.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62731.log.gz
2026-03-24T12:09:12.595 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87217.log
2026-03-24T12:09:12.595 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.89182.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89182.log.gz
2026-03-24T12:09:12.595 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33212.log
2026-03-24T12:09:12.596 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.87217.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55610.log
2026-03-24T12:09:12.596 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87217.log.gz
2026-03-24T12:09:12.596 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.33212.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26383.log
2026-03-24T12:09:12.596 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.33212.log.gz
2026-03-24T12:09:12.596 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55610.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55610.log.gz
2026-03-24T12:09:12.596 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68206.log
2026-03-24T12:09:12.597 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.26383.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43064.log
2026-03-24T12:09:12.597 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26383.log.gz
2026-03-24T12:09:12.597 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68206.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68951.log
2026-03-24T12:09:12.597 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68206.log.gz
2026-03-24T12:09:12.597 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43064.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43064.log.gz
2026-03-24T12:09:12.597 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57586.log
2026-03-24T12:09:12.597 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68951.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75152.log
2026-03-24T12:09:12.597 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68951.log.gz
2026-03-24T12:09:12.598 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.57586.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39190.log
2026-03-24T12:09:12.598 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75152.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75152.log.gz
2026-03-24T12:09:12.598 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80810.log
2026-03-24T12:09:12.598 INFO:teuthology.orchestra.run.vm05.stderr: 51.3% -- replaced with /var/log/ceph/ceph-client.admin.57586.log.gz
2026-03-24T12:09:12.599 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39190.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41018.log
2026-03-24T12:09:12.599 INFO:teuthology.orchestra.run.vm05.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.39190.log.gz
2026-03-24T12:09:12.599 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80810.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80810.log.gz
2026-03-24T12:09:12.599 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42958.log
2026-03-24T12:09:12.599 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41018.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41018.log.gz
2026-03-24T12:09:12.599 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74153.log
2026-03-24T12:09:12.599 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.42958.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44555.log
2026-03-24T12:09:12.600 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74153.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74153.log.gz
2026-03-24T12:09:12.600 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74778.log
2026-03-24T12:09:12.600 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44555.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44555.log.gz
2026-03-24T12:09:12.600 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74571.log
2026-03-24T12:09:12.600 INFO:teuthology.orchestra.run.vm05.stderr: 54.4% -- replaced with /var/log/ceph/ceph-client.admin.42958.log.gz
2026-03-24T12:09:12.600 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74778.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74778.log.gz
2026-03-24T12:09:12.600 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85758.log
2026-03-24T12:09:12.601 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74571.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74571.log.gz
2026-03-24T12:09:12.601 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64697.log
2026-03-24T12:09:12.601 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.85758.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80233.log
2026-03-24T12:09:12.601 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85758.log.gz
2026-03-24T12:09:12.601 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64697.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64697.log.gz
2026-03-24T12:09:12.601 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33040.log
2026-03-24T12:09:12.601 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80233.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80233.log.gz
2026-03-24T12:09:12.601 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32256.log
2026-03-24T12:09:12.602 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.33040.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56711.log
2026-03-24T12:09:12.602 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32256.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.33040.log.gzgzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68645.log
2026-03-24T12:09:12.602 INFO:teuthology.orchestra.run.vm05.stderr:
2026-03-24T12:09:12.602 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32256.log.gz
2026-03-24T12:09:12.602 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.56711.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56711.log.gz
2026-03-24T12:09:12.602 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87183.log
2026-03-24T12:09:12.603 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29746.log
2026-03-24T12:09:12.603 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68645.log: /var/log/ceph/ceph-client.admin.87183.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87183.log.gz
2026-03-24T12:09:12.603 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68645.log.gz
2026-03-24T12:09:12.603 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53073.log
2026-03-24T12:09:12.603 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.29746.log: 0.0%gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32819.log
2026-03-24T12:09:12.603 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.29746.log.gz
2026-03-24T12:09:12.604 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.53073.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81092.log
2026-03-24T12:09:12.604 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53073.log.gz
2026-03-24T12:09:12.604 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32819.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27320.log
2026-03-24T12:09:12.604 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32819.log.gz
2026-03-24T12:09:12.604 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.81092.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81092.log.gz
2026-03-24T12:09:12.604 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54030.log
2026-03-24T12:09:12.605 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27320.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27320.log.gz
2026-03-24T12:09:12.605 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72901.log
2026-03-24T12:09:12.605 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54030.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54030.log.gz
2026-03-24T12:09:12.605 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87978.log
2026-03-24T12:09:12.605 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72901.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31677.log
2026-03-24T12:09:12.605 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72901.log.gz
2026-03-24T12:09:12.605 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.87978.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87978.log.gz
2026-03-24T12:09:12.606 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57326.log
2026-03-24T12:09:12.606 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.31677.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32359.log
2026-03-24T12:09:12.606 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.57326.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.31677.log.gz
2026-03-24T12:09:12.606 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57326.log.gz
2026-03-24T12:09:12.606 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27452.log
2026-03-24T12:09:12.606 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32359.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64870.log
2026-03-24T12:09:12.607 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27452.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32359.log.gz
2026-03-24T12:09:12.607 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71266.log
2026-03-24T12:09:12.607 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27452.log.gz
2026-03-24T12:09:12.607 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64870.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64870.log.gz
2026-03-24T12:09:12.607 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30004.log
2026-03-24T12:09:12.607 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.71266.log: gzip 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71266.log.gz
2026-03-24T12:09:12.607 INFO:teuthology.orchestra.run.vm05.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.85822.log
2026-03-24T12:09:12.608 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30004.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81003.log
2026-03-24T12:09:12.608 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30004.log.gz
2026-03-24T12:09:12.608 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.85822.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85822.log.gz
2026-03-24T12:09:12.608 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69212.log
2026-03-24T12:09:12.608 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.81003.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81003.log.gz
2026-03-24T12:09:12.608 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69878.log
2026-03-24T12:09:12.609 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.69212.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53497.log
2026-03-24T12:09:12.609 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69212.log.gz
2026-03-24T12:09:12.609 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.69878.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69878.log.gz
2026-03-24T12:09:12.609 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74939.log
2026-03-24T12:09:12.609 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.53497.log: gzip 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53497.log.gz -5
2026-03-24T12:09:12.609 INFO:teuthology.orchestra.run.vm05.stderr: --verbose -- /var/log/ceph/ceph-client.admin.45589.log
2026-03-24T12:09:12.610 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74939.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49695.log
2026-03-24T12:09:12.610 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74939.log.gz
2026-03-24T12:09:12.610 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45589.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45589.log.gz
2026-03-24T12:09:12.610 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72606.log
2026-03-24T12:09:12.610 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.49695.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49695.log.gz
2026-03-24T12:09:12.610 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26837.log
2026-03-24T12:09:12.611 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72606.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30176.log
2026-03-24T12:09:12.611 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72606.log.gz
2026-03-24T12:09:12.611 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.26837.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69089.log
2026-03-24T12:09:12.611 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26837.log.gz
2026-03-24T12:09:12.611 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30176.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30176.log.gz
2026-03-24T12:09:12.611 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46085.log
2026-03-24T12:09:12.611 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.69089.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61302.log
2026-03-24T12:09:12.612 INFO:teuthology.orchestra.run.vm05.stderr: 58.1% -- replaced with /var/log/ceph/ceph-client.admin.69089.log.gz/var/log/ceph/ceph-client.admin.46085.log:
2026-03-24T12:09:12.612 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46085.log.gz
2026-03-24T12:09:12.612 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83495.log
2026-03-24T12:09:12.612 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61302.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61302.log.gz
2026-03-24T12:09:12.612 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76255.log
2026-03-24T12:09:12.612 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83495.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29617.log
2026-03-24T12:09:12.612 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83495.log.gz
2026-03-24T12:09:12.613 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.76255.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72060.log
2026-03-24T12:09:12.613 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76255.log.gz
2026-03-24T12:09:12.613 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.29617.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29617.log.gz
2026-03-24T12:09:12.613 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54504.log
2026-03-24T12:09:12.613 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72060.log: 0.0% --
replaced with /var/log/ceph/ceph-client.admin.72060.log.gz 2026-03-24T12:09:12.613 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62694.log 2026-03-24T12:09:12.614 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54504.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71223.log 2026-03-24T12:09:12.614 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54504.log.gz 2026-03-24T12:09:12.614 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62694.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62694.log.gz 2026-03-24T12:09:12.614 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53356.log 2026-03-24T12:09:12.614 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.71223.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71223.log.gz 2026-03-24T12:09:12.615 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55889.log 2026-03-24T12:09:12.615 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.53356.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88085.log 2026-03-24T12:09:12.615 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53356.log.gz 2026-03-24T12:09:12.615 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55889.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55889.log.gz 2026-03-24T12:09:12.615 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47559.log 2026-03-24T12:09:12.615 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82628.log 2026-03-24T12:09:12.616 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.88085.log: gzip/var/log/ceph/ceph-client.admin.47559.log: 
-5 --verbose -- /var/log/ceph/ceph-client.admin.50018.log 2026-03-24T12:09:12.616 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88085.log.gz 2026-03-24T12:09:12.616 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47559.log.gz 2026-03-24T12:09:12.616 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82628.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.82628.log.gz 2026-03-24T12:09:12.616 INFO:teuthology.orchestra.run.vm05.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.75665.log 2026-03-24T12:09:12.616 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50018.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50018.log.gz 2026-03-24T12:09:12.616 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69513.log 2026-03-24T12:09:12.617 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75665.log: 0.0%gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40048.log 2026-03-24T12:09:12.617 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.75665.log.gz 2026-03-24T12:09:12.617 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.69513.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69513.log.gz 2026-03-24T12:09:12.617 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75204.log 2026-03-24T12:09:12.617 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40048.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.40048.log.gz -5 2026-03-24T12:09:12.617 INFO:teuthology.orchestra.run.vm05.stderr: --verbose -- /var/log/ceph/ceph-client.admin.89994.log 2026-03-24T12:09:12.617 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75204.log: gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.71739.log 2026-03-24T12:09:12.618 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75204.log.gz 2026-03-24T12:09:12.618 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.89994.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89994.log.gz 2026-03-24T12:09:12.618 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53223.log 2026-03-24T12:09:12.618 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.71739.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57815.log 2026-03-24T12:09:12.618 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71739.log.gz 2026-03-24T12:09:12.619 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.53223.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73499.log 2026-03-24T12:09:12.619 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.57815.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57815.log.gz 2026-03-24T12:09:12.619 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.53223.log.gzgzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77984.log 2026-03-24T12:09:12.619 INFO:teuthology.orchestra.run.vm05.stderr: 2026-03-24T12:09:12.619 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.73499.log: gzip 0.0% -5 -- replaced with /var/log/ceph/ceph-client.admin.73499.log.gz --verbose 2026-03-24T12:09:12.619 INFO:teuthology.orchestra.run.vm05.stderr: -- /var/log/ceph/ceph-client.admin.36586.log 2026-03-24T12:09:12.619 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77984.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77984.log.gz 2026-03-24T12:09:12.619 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.31450.log 2026-03-24T12:09:12.620 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36586.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33368.log 2026-03-24T12:09:12.620 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36586.log.gz 2026-03-24T12:09:12.620 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.31450.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31450.log.gz 2026-03-24T12:09:12.620 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39127.log 2026-03-24T12:09:12.620 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90961.log 2026-03-24T12:09:12.620 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.33368.log: /var/log/ceph/ceph-client.admin.39127.log: 58.1% -- replaced with /var/log/ceph/ceph-client.admin.33368.log.gz 2026-03-24T12:09:12.621 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41802.log 2026-03-24T12:09:12.621 INFO:teuthology.orchestra.run.vm05.stderr: 26.2% -- replaced with /var/log/ceph/ceph-client.admin.39127.log.gz 2026-03-24T12:09:12.621 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54611.log 2026-03-24T12:09:12.621 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90961.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90961.log.gz 2026-03-24T12:09:12.621 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41802.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41802.log.gz 2026-03-24T12:09:12.621 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83743.log 2026-03-24T12:09:12.622 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.73177.log 2026-03-24T12:09:12.622 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54611.log: /var/log/ceph/ceph-client.admin.83743.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81916.log 2026-03-24T12:09:12.622 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83743.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54611.log.gz 2026-03-24T12:09:12.622 INFO:teuthology.orchestra.run.vm05.stderr: 2026-03-24T12:09:12.622 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.73177.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73177.log.gz 2026-03-24T12:09:12.622 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48686.log 2026-03-24T12:09:12.623 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.81916.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.81916.log.gz 2026-03-24T12:09:12.623 INFO:teuthology.orchestra.run.vm05.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.29231.log 2026-03-24T12:09:12.623 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.48686.log: 0.0%gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77452.log 2026-03-24T12:09:12.623 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.48686.log.gz 2026-03-24T12:09:12.623 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.29231.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29231.log.gz 2026-03-24T12:09:12.623 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89409.log 2026-03-24T12:09:12.623 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77452.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77452.log.gz 2026-03-24T12:09:12.623 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.57917.log 2026-03-24T12:09:12.624 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.89409.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83376.log 2026-03-24T12:09:12.624 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89409.log.gz 2026-03-24T12:09:12.624 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.57917.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81070.log 0.0% 2026-03-24T12:09:12.624 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.57917.log.gz 2026-03-24T12:09:12.624 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83376.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83376.log.gz 2026-03-24T12:09:12.624 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80011.log 2026-03-24T12:09:12.625 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.81070.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72123.log 2026-03-24T12:09:12.625 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81070.log.gz 2026-03-24T12:09:12.625 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80011.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80011.log.gz 2026-03-24T12:09:12.625 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78521.log 2026-03-24T12:09:12.625 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72123.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72123.log.gz 2026-03-24T12:09:12.625 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85392.log 2026-03-24T12:09:12.626 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78521.log: 
gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81950.log 2026-03-24T12:09:12.626 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78521.log.gz 2026-03-24T12:09:12.626 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.85392.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46626.log 2026-03-24T12:09:12.626 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85392.log.gz 2026-03-24T12:09:12.626 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.81950.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81950.log.gz 2026-03-24T12:09:12.626 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49652.log 2026-03-24T12:09:12.626 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46626.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67841.log 2026-03-24T12:09:12.627 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46626.log.gz 2026-03-24T12:09:12.627 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.49652.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81848.log 2026-03-24T12:09:12.627 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49652.log.gz 2026-03-24T12:09:12.627 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67841.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67841.log.gz 2026-03-24T12:09:12.627 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48108.log 2026-03-24T12:09:12.627 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.81848.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81848.log.gzgzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26891.log 2026-03-24T12:09:12.627 
INFO:teuthology.orchestra.run.vm05.stderr: 2026-03-24T12:09:12.628 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.48108.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35855.log 2026-03-24T12:09:12.628 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48108.log.gz 2026-03-24T12:09:12.628 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.26891.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59477.log 2026-03-24T12:09:12.628 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.26891.log.gz 2026-03-24T12:09:12.628 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35855.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64445.log 2026-03-24T12:09:12.628 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35855.log.gz 2026-03-24T12:09:12.629 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59477.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71911.log 2026-03-24T12:09:12.629 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59477.log.gz 2026-03-24T12:09:12.629 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64445.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64445.log.gz 2026-03-24T12:09:12.629 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35906.log 2026-03-24T12:09:12.629 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55502.log 2026-03-24T12:09:12.629 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.71911.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71911.log.gz 2026-03-24T12:09:12.629 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35906.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.32836.log 2026-03-24T12:09:12.630 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35906.log.gz 2026-03-24T12:09:12.630 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55502.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55502.log.gz 2026-03-24T12:09:12.630 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27434.log 2026-03-24T12:09:12.630 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32836.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30913.log 2026-03-24T12:09:12.631 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27434.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32836.log.gz 2026-03-24T12:09:12.631 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54114.log 2026-03-24T12:09:12.631 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27434.log.gz 2026-03-24T12:09:12.631 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30913.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30913.log.gz 2026-03-24T12:09:12.631 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54964.log 2026-03-24T12:09:12.631 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54114.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54114.log.gz 2026-03-24T12:09:12.631 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40825.log 2026-03-24T12:09:12.632 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54964.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38255.log 2026-03-24T12:09:12.632 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.54964.log.gz 2026-03-24T12:09:12.632 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40825.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25968.log 2026-03-24T12:09:12.632 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40825.log.gz 2026-03-24T12:09:12.632 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.38255.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54749.log 2026-03-24T12:09:12.632 INFO:teuthology.orchestra.run.vm05.stderr: 25.7% -- replaced with /var/log/ceph/ceph-client.admin.38255.log.gz 2026-03-24T12:09:12.633 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.25968.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25968.log.gz 2026-03-24T12:09:12.633 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64505.log 2026-03-24T12:09:12.633 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54749.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54749.log.gz 2026-03-24T12:09:12.633 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79322.log 2026-03-24T12:09:12.633 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64505.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64505.log.gz 2026-03-24T12:09:12.633 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79663.log 2026-03-24T12:09:12.634 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.79322.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25606.log 2026-03-24T12:09:12.634 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79322.log.gz 2026-03-24T12:09:12.634 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.79663.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.79663.log.gz 2026-03-24T12:09:12.634 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65131.log 2026-03-24T12:09:12.634 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.25606.log: gzip -5 0.0% --verbose -- -- replaced with /var/log/ceph/ceph-client.admin.25606.log.gz /var/log/ceph/ceph-client.admin.74077.log 2026-03-24T12:09:12.634 INFO:teuthology.orchestra.run.vm05.stderr: 2026-03-24T12:09:12.634 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65131.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74830.log 2026-03-24T12:09:12.634 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65131.log.gz 2026-03-24T12:09:12.635 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74077.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74077.log.gz 2026-03-24T12:09:12.635 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74286.log 2026-03-24T12:09:12.635 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74830.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74830.log.gz 2026-03-24T12:09:12.635 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72143.log 2026-03-24T12:09:12.635 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74286.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30808.log 2026-03-24T12:09:12.635 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74286.log.gz 2026-03-24T12:09:12.636 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72143.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72143.log.gz 2026-03-24T12:09:12.636 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.45200.log 2026-03-24T12:09:12.636 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30808.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87677.log 2026-03-24T12:09:12.636 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30808.log.gz 2026-03-24T12:09:12.636 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45200.log: 0.0%gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41169.log 2026-03-24T12:09:12.636 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.45200.log.gz 2026-03-24T12:09:12.636 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.87677.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87677.log.gz 2026-03-24T12:09:12.636 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82951.log 2026-03-24T12:09:12.637 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41169.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41169.log.gz 2026-03-24T12:09:12.637 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69835.log 2026-03-24T12:09:12.637 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82951.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60592.log 2026-03-24T12:09:12.637 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82951.log.gz 2026-03-24T12:09:12.637 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.69835.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69835.log.gz 2026-03-24T12:09:12.637 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85286.log 2026-03-24T12:09:12.637 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.60592.log: gzip 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.60592.log.gz -5 2026-03-24T12:09:12.637 INFO:teuthology.orchestra.run.vm05.stderr: --verbose -- /var/log/ceph/ceph-client.admin.64244.log 2026-03-24T12:09:12.638 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.85286.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62384.log 2026-03-24T12:09:12.638 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85286.log.gz 2026-03-24T12:09:12.638 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64244.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.64244.log.gz 2026-03-24T12:09:12.638 INFO:teuthology.orchestra.run.vm05.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.55050.log 2026-03-24T12:09:12.638 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62384.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87806.log 2026-03-24T12:09:12.639 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62384.log.gz 2026-03-24T12:09:12.639 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55050.log: 0.0%gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81586.log 2026-03-24T12:09:12.639 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.55050.log.gz 2026-03-24T12:09:12.639 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.87806.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87806.log.gz 2026-03-24T12:09:12.639 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86338.log 2026-03-24T12:09:12.639 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.81586.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.81586.log.gz 2026-03-24T12:09:12.639 INFO:teuthology.orchestra.run.vm05.stderr: -5 --verbose -- 
/var/log/ceph/ceph-client.admin.43971.log 2026-03-24T12:09:12.640 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.86338.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29252.log 2026-03-24T12:09:12.640 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86338.log.gz 2026-03-24T12:09:12.640 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43971.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48005.log 2026-03-24T12:09:12.640 INFO:teuthology.orchestra.run.vm05.stderr: 25.4% -- replaced with /var/log/ceph/ceph-client.admin.43971.log.gz 2026-03-24T12:09:12.640 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.29252.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29252.log.gz 2026-03-24T12:09:12.640 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38959.log 2026-03-24T12:09:12.641 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.48005.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69405.log 2026-03-24T12:09:12.641 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48005.log.gz 2026-03-24T12:09:12.641 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.38959.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38959.log.gz 2026-03-24T12:09:12.641 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26275.log 2026-03-24T12:09:12.641 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.69405.log: gzip 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69405.log.gz -5 2026-03-24T12:09:12.641 INFO:teuthology.orchestra.run.vm05.stderr: --verbose -- /var/log/ceph/ceph-client.admin.74688.log 2026-03-24T12:09:12.641 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.26275.log: 0.0%gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.59146.log 2026-03-24T12:09:12.641 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.26275.log.gz 2026-03-24T12:09:12.642 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74688.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74688.log.gz 2026-03-24T12:09:12.642 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73091.log 2026-03-24T12:09:12.642 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59146.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37240.log 2026-03-24T12:09:12.642 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59146.log.gz 2026-03-24T12:09:12.642 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.73091.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40911.log 2026-03-24T12:09:12.642 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73091.log.gz 2026-03-24T12:09:12.643 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.37240.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45568.log 2026-03-24T12:09:12.643 INFO:teuthology.orchestra.run.vm05.stderr: 25.9% -- replaced with /var/log/ceph/ceph-client.admin.37240.log.gz 2026-03-24T12:09:12.643 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40911.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40911.log.gz 2026-03-24T12:09:12.643 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34052.log 2026-03-24T12:09:12.643 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45568.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58391.log 2026-03-24T12:09:12.643 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.45568.log.gz 2026-03-24T12:09:12.643 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34052.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34052.log.gz 2026-03-24T12:09:12.643 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34152.log 2026-03-24T12:09:12.644 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.58391.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.58391.log.gz -5 2026-03-24T12:09:12.644 INFO:teuthology.orchestra.run.vm05.stderr: --verbose -- /var/log/ceph/ceph-client.admin.34992.log 2026-03-24T12:09:12.644 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34152.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75080.log 2026-03-24T12:09:12.644 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34152.log.gz 2026-03-24T12:09:12.644 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34992.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34992.log.gz 2026-03-24T12:09:12.644 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41740.log 2026-03-24T12:09:12.645 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75080.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56013.log 2026-03-24T12:09:12.645 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75080.log.gz 2026-03-24T12:09:12.645 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41740.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51746.log 2026-03-24T12:09:12.645 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41740.log.gz 2026-03-24T12:09:12.645 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.56013.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.56013.log.gz 2026-03-24T12:09:12.645 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75061.log 2026-03-24T12:09:12.645 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.51746.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54525.log 2026-03-24T12:09:12.646 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51746.log.gz 2026-03-24T12:09:12.646 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75061.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25638.log 2026-03-24T12:09:12.646 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75061.log.gz 2026-03-24T12:09:12.646 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54525.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54525.log.gz 2026-03-24T12:09:12.646 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40398.log 2026-03-24T12:09:12.646 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.25638.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25638.log.gzgzip 2026-03-24T12:09:12.646 INFO:teuthology.orchestra.run.vm05.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.74450.log 2026-03-24T12:09:12.647 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40398.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68830.log 2026-03-24T12:09:12.647 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40398.log.gz 2026-03-24T12:09:12.647 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74450.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74450.log.gz 2026-03-24T12:09:12.647 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.83373.log 2026-03-24T12:09:12.647 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68830.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68830.log.gz 2026-03-24T12:09:12.647 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54856.log 2026-03-24T12:09:12.648 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83373.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64676.log 2026-03-24T12:09:12.648 INFO:teuthology.orchestra.run.vm05.stderr: 83.5%/var/log/ceph/ceph-client.admin.54856.log: -- replaced with /var/log/ceph/ceph-client.admin.83373.log.gz 2026-03-24T12:09:12.648 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54856.log.gz 2026-03-24T12:09:12.648 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42385.log 2026-03-24T12:09:12.648 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64676.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79508.log 2026-03-24T12:09:12.648 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64676.log.gz 2026-03-24T12:09:12.648 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72166.log 2026-03-24T12:09:12.649 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.42385.log: /var/log/ceph/ceph-client.admin.79508.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56434.log 2026-03-24T12:09:12.649 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79508.log.gz 2026-03-24T12:09:12.649 INFO:teuthology.orchestra.run.vm05.stderr: 12.0% -- replaced with /var/log/ceph/ceph-client.admin.42385.log.gz 2026-03-24T12:09:12.649 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72166.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.57794.log 2026-03-24T12:09:12.649 INFO:teuthology.orchestra.run.vm05.stderr: 56.3% -- replaced with /var/log/ceph/ceph-client.admin.72166.log.gz 2026-03-24T12:09:12.649 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.56434.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46106.log 2026-03-24T12:09:12.649 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56434.log.gz 2026-03-24T12:09:12.650 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.57794.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66895.log 2026-03-24T12:09:12.650 INFO:teuthology.orchestra.run.vm05.stderr: 26.4%/var/log/ceph/ceph-client.admin.46106.log: -- replaced with /var/log/ceph/ceph-client.admin.57794.log.gz 2026-03-24T12:09:12.650 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46106.log.gz 2026-03-24T12:09:12.650 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82564.log 2026-03-24T12:09:12.650 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66895.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75279.log 2026-03-24T12:09:12.650 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66895.log.gz 2026-03-24T12:09:12.650 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82564.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82564.log.gz 2026-03-24T12:09:12.650 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84758.log 2026-03-24T12:09:12.651 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75279.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51208.log 2026-03-24T12:09:12.651 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.75279.log.gz 2026-03-24T12:09:12.651 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84758.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66375.log 2026-03-24T12:09:12.651 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84758.log.gz 2026-03-24T12:09:12.651 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.51208.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51208.log.gz 2026-03-24T12:09:12.651 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27750.log 2026-03-24T12:09:12.652 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66375.log: gzip 0.0% -5 --verbose -- /var/log/ceph/ceph-client.admin.36283.log 2026-03-24T12:09:12.652 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.66375.log.gz 2026-03-24T12:09:12.652 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27750.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27750.log.gz 2026-03-24T12:09:12.652 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68518.log 2026-03-24T12:09:12.652 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36283.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64793.log 2026-03-24T12:09:12.652 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36283.log.gz 2026-03-24T12:09:12.653 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68518.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45417.log 2026-03-24T12:09:12.653 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68518.log.gz 2026-03-24T12:09:12.653 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64793.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.64793.log.gz 2026-03-24T12:09:12.653 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38612.log 2026-03-24T12:09:12.653 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45417.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45417.log.gz 2026-03-24T12:09:12.653 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74374.log 2026-03-24T12:09:12.654 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.38612.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39704.log 2026-03-24T12:09:12.654 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74374.log: 26.4% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74374.log.gz 2026-03-24T12:09:12.654 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.38612.log.gz 2026-03-24T12:09:12.654 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55631.log 2026-03-24T12:09:12.654 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39704.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39704.log.gz 2026-03-24T12:09:12.654 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26340.log 2026-03-24T12:09:12.655 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38843.log 2026-03-24T12:09:12.655 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55631.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55631.log.gz 2026-03-24T12:09:12.655 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.26340.log: gzip 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26340.log.gz 2026-03-24T12:09:12.655 INFO:teuthology.orchestra.run.vm05.stderr: -5 --verbose -- 
/var/log/ceph/ceph-client.admin.67282.log 2026-03-24T12:09:12.655 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.38843.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38843.log.gz 2026-03-24T12:09:12.655 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59685.log 2026-03-24T12:09:12.656 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67282.log: 0.0%gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28779.log 2026-03-24T12:09:12.656 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.67282.log.gz 2026-03-24T12:09:12.656 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59685.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74629.log 2026-03-24T12:09:12.656 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59685.log.gz 2026-03-24T12:09:12.656 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28779.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28779.log.gz 2026-03-24T12:09:12.656 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65550.log 2026-03-24T12:09:12.657 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74629.log: 0.0%gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83319.log 2026-03-24T12:09:12.657 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.74629.log.gz 2026-03-24T12:09:12.657 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65550.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65550.log.gz 2026-03-24T12:09:12.657 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69128.log 2026-03-24T12:09:12.657 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83319.log: gzip -5 --verbose 
-- /var/log/ceph/ceph-client.admin.27009.log 2026-03-24T12:09:12.657 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83319.log.gz 2026-03-24T12:09:12.657 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.69128.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36045.log 2026-03-24T12:09:12.658 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27009.log: 58.2% -- replaced with /var/log/ceph/ceph-client.admin.69128.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27009.log.gz 2026-03-24T12:09:12.658 INFO:teuthology.orchestra.run.vm05.stderr: 2026-03-24T12:09:12.658 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38213.log 2026-03-24T12:09:12.658 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36045.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74412.log 2026-03-24T12:09:12.658 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36045.log.gz 2026-03-24T12:09:12.658 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.38213.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63150.log 2026-03-24T12:09:12.658 INFO:teuthology.orchestra.run.vm05.stderr: 26.6% -- replaced with /var/log/ceph/ceph-client.admin.38213.log.gz 2026-03-24T12:09:12.659 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74412.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85509.log 2026-03-24T12:09:12.659 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74412.log.gz 2026-03-24T12:09:12.659 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63150.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52516.log 2026-03-24T12:09:12.659 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.63150.log.gz 2026-03-24T12:09:12.659 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.85509.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85509.log.gz 2026-03-24T12:09:12.659 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43992.log 2026-03-24T12:09:12.660 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.52516.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30790.log 2026-03-24T12:09:12.660 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43992.log: 52.7% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43992.log.gz 2026-03-24T12:09:12.660 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.52516.log.gz 2026-03-24T12:09:12.660 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68581.log 2026-03-24T12:09:12.660 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30790.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72589.log 2026-03-24T12:09:12.660 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30790.log.gz 2026-03-24T12:09:12.660 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68581.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68581.log.gz 2026-03-24T12:09:12.660 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62190.log 2026-03-24T12:09:12.661 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72589.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69986.log 2026-03-24T12:09:12.661 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72589.log.gz 2026-03-24T12:09:12.661 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62190.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.62190.log.gz 2026-03-24T12:09:12.661 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79548.log 2026-03-24T12:09:12.661 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.69986.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89736.log 2026-03-24T12:09:12.661 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69986.log.gz 2026-03-24T12:09:12.662 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.79548.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78027.log 2026-03-24T12:09:12.662 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79548.log.gz 2026-03-24T12:09:12.662 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.89736.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89736.log.gz 2026-03-24T12:09:12.662 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63361.log 2026-03-24T12:09:12.662 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78027.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28801.log 2026-03-24T12:09:12.662 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78027.log.gz 2026-03-24T12:09:12.663 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72919.log 2026-03-24T12:09:12.663 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63361.log: /var/log/ceph/ceph-client.admin.28801.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28801.log.gz 2026-03-24T12:09:12.664 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63361.log.gz 2026-03-24T12:09:12.664 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.35332.log 2026-03-24T12:09:12.664 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88820.log 2026-03-24T12:09:12.664 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72919.log: /var/log/ceph/ceph-client.admin.35332.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72919.log.gz 2026-03-24T12:09:12.664 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35332.log.gz 2026-03-24T12:09:12.664 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41905.log 2026-03-24T12:09:12.664 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.88820.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78538.log 2026-03-24T12:09:12.664 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88820.log.gz 2026-03-24T12:09:12.665 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41905.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43651.log 2026-03-24T12:09:12.665 INFO:teuthology.orchestra.run.vm05.stderr: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.41905.log.gz 2026-03-24T12:09:12.665 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78538.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78538.log.gz 2026-03-24T12:09:12.665 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83721.log 2026-03-24T12:09:12.665 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43651.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38801.log 2026-03-24T12:09:12.665 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43651.log.gz 2026-03-24T12:09:12.665 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83721.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.83721.log.gz 2026-03-24T12:09:12.665 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43126.log 2026-03-24T12:09:12.666 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.38801.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32938.log 2026-03-24T12:09:12.666 INFO:teuthology.orchestra.run.vm05.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.38801.log.gz 2026-03-24T12:09:12.666 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43126.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80702.log 2026-03-24T12:09:12.666 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43126.log.gz 2026-03-24T12:09:12.666 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32938.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39022.log 2026-03-24T12:09:12.667 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32938.log.gz 2026-03-24T12:09:12.667 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80702.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80702.log.gz 2026-03-24T12:09:12.667 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63046.log 2026-03-24T12:09:12.667 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39022.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39855.log 2026-03-24T12:09:12.667 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63046.log: 25.8% -- replaced with /var/log/ceph/ceph-client.admin.39022.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63046.log.gz 2026-03-24T12:09:12.667 INFO:teuthology.orchestra.run.vm05.stderr: 2026-03-24T12:09:12.667 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.53245.log 2026-03-24T12:09:12.668 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39855.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39855.log.gz 2026-03-24T12:09:12.668 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30541.log 2026-03-24T12:09:12.668 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.53245.log: gzip 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53245.log.gz -5 2026-03-24T12:09:12.668 INFO:teuthology.orchestra.run.vm05.stderr: --verbose -- /var/log/ceph/ceph-client.admin.48194.log 2026-03-24T12:09:12.668 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30541.log: 0.0%gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60915.log 2026-03-24T12:09:12.668 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.30541.log.gz 2026-03-24T12:09:12.669 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.48194.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48194.log.gz 2026-03-24T12:09:12.669 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43950.log 2026-03-24T12:09:12.669 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.60915.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60915.log.gz 2026-03-24T12:09:12.669 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43573.log 2026-03-24T12:09:12.669 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43950.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34912.log 2026-03-24T12:09:12.669 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43950.log.gz 2026-03-24T12:09:12.669 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43573.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.43573.log.gz 2026-03-24T12:09:12.669 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42304.log 2026-03-24T12:09:12.670 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34912.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54280.log 2026-03-24T12:09:12.670 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34912.log.gz 2026-03-24T12:09:12.670 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.42304.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61345.log 2026-03-24T12:09:12.670 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42304.log.gz 2026-03-24T12:09:12.670 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54280.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70047.log 2026-03-24T12:09:12.670 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.54280.log.gz 2026-03-24T12:09:12.671 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61345.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61345.log.gz 2026-03-24T12:09:12.671 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29724.log 2026-03-24T12:09:12.671 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.70047.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89628.log 2026-03-24T12:09:12.671 INFO:teuthology.orchestra.run.vm05.stderr: 53.6% -- replaced with /var/log/ceph/ceph-client.admin.70047.log.gz 2026-03-24T12:09:12.671 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.29724.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.29724.log.gz -5 2026-03-24T12:09:12.671 INFO:teuthology.orchestra.run.vm05.stderr: --verbose -- 
/var/log/ceph/ceph-client.admin.81393.log 2026-03-24T12:09:12.672 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.89628.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83958.log 2026-03-24T12:09:12.672 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89628.log.gz 2026-03-24T12:09:12.672 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.81393.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65980.log 2026-03-24T12:09:12.672 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81393.log.gz 2026-03-24T12:09:12.672 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83958.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83958.log.gz 2026-03-24T12:09:12.672 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67389.log 2026-03-24T12:09:12.672 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65980.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32751.log 2026-03-24T12:09:12.673 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65980.log.gz 2026-03-24T12:09:12.673 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67389.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33652.log 2026-03-24T12:09:12.673 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67389.log.gz 2026-03-24T12:09:12.673 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32751.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77247.log 2026-03-24T12:09:12.673 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32751.log.gz 2026-03-24T12:09:12.673 INFO:teuthology.orchestra.run.vm05.stderr:gzip/var/log/ceph/ceph-client.admin.33652.log: -5 
--verbose -- /var/log/ceph/ceph-client.admin.68077.log 2026-03-24T12:09:12.673 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33652.log.gz 2026-03-24T12:09:12.674 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77247.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81801.log 2026-03-24T12:09:12.674 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77247.log.gz 2026-03-24T12:09:12.674 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68077.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68077.log.gz 2026-03-24T12:09:12.674 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62362.log 2026-03-24T12:09:12.674 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.81801.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81522.log 2026-03-24T12:09:12.674 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81801.log.gz 2026-03-24T12:09:12.675 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62362.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50864.log 2026-03-24T12:09:12.675 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.81522.log: 58.2% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81522.log.gz 2026-03-24T12:09:12.675 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.62362.log.gz 2026-03-24T12:09:12.675 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27032.log 2026-03-24T12:09:12.675 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50864.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50864.log.gz 2026-03-24T12:09:12.675 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.68747.log 2026-03-24T12:09:12.675 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27032.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74706.log 2026-03-24T12:09:12.676 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.27032.log.gz 2026-03-24T12:09:12.676 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68747.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72023.log 2026-03-24T12:09:12.676 INFO:teuthology.orchestra.run.vm05.stderr: 12.0% -- replaced with /var/log/ceph/ceph-client.admin.68747.log.gz 2026-03-24T12:09:12.676 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74706.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74706.log.gz 2026-03-24T12:09:12.676 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31429.log 2026-03-24T12:09:12.677 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72023.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84629.log 2026-03-24T12:09:12.677 INFO:teuthology.orchestra.run.vm05.stderr: 10.9% -- replaced with /var/log/ceph/ceph-client.admin.72023.log.gz 2026-03-24T12:09:12.677 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.31429.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61552.log 2026-03-24T12:09:12.677 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31429.log.gz 2026-03-24T12:09:12.677 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84629.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84629.log.gz 2026-03-24T12:09:12.677 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33792.log 2026-03-24T12:09:12.677 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61552.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.61552.log.gz 2026-03-24T12:09:12.677 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32410.log 2026-03-24T12:09:12.678 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.33792.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55071.log 2026-03-24T12:09:12.678 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33792.log.gz 2026-03-24T12:09:12.678 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32410.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79804.log 2026-03-24T12:09:12.678 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32410.log.gz 2026-03-24T12:09:12.678 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55071.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55071.log.gz 2026-03-24T12:09:12.678 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54899.log 2026-03-24T12:09:12.679 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.79804.log: 0.0%gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31171.log 2026-03-24T12:09:12.679 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.79804.log.gz 2026-03-24T12:09:12.679 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54899.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54899.log.gz 2026-03-24T12:09:12.679 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87505.log 2026-03-24T12:09:12.679 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.31171.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31171.log.gz 2026-03-24T12:09:12.679 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.74590.log 2026-03-24T12:09:12.680 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.87505.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87505.log.gz 2026-03-24T12:09:12.680 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35821.log 2026-03-24T12:09:12.680 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74590.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74590.log.gz 2026-03-24T12:09:12.680 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66562.log 2026-03-24T12:09:12.680 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35821.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35821.log.gz 2026-03-24T12:09:12.680 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30870.log 2026-03-24T12:09:12.681 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66562.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30631.log 2026-03-24T12:09:12.681 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66562.log.gz 2026-03-24T12:09:12.681 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30870.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30870.log.gz 2026-03-24T12:09:12.681 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55481.log 2026-03-24T12:09:12.681 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30631.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30631.log.gz 2026-03-24T12:09:12.681 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27005.log 2026-03-24T12:09:12.682 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55481.log: gzip -5 --verbose 
-- /var/log/ceph/ceph-client.admin.43730.log 2026-03-24T12:09:12.682 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55481.log.gz 2026-03-24T12:09:12.682 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27005.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27005.log.gz 2026-03-24T12:09:12.682 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86016.log 2026-03-24T12:09:12.682 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43730.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43730.log.gz 2026-03-24T12:09:12.682 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69190.log 2026-03-24T12:09:12.682 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.86016.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86016.log.gz 2026-03-24T12:09:12.683 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78572.log 2026-03-24T12:09:12.683 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.69190.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69190.log.gz 2026-03-24T12:09:12.683 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81350.log 2026-03-24T12:09:12.683 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78572.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78572.log.gz 2026-03-24T12:09:12.683 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76881.log 2026-03-24T12:09:12.683 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.81350.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83353.log 2026-03-24T12:09:12.683 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.81350.log.gz 2026-03-24T12:09:12.684 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.76881.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76881.log.gz 2026-03-24T12:09:12.684 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53094.log 2026-03-24T12:09:12.684 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83353.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83353.log.gz 2026-03-24T12:09:12.684 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.53094.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53094.log.gz 2026-03-24T12:09:12.684 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50104.log 2026-03-24T12:09:12.685 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87456.log 2026-03-24T12:09:12.685 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50104.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52879.log 2026-03-24T12:09:12.685 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50104.log.gz 2026-03-24T12:09:12.685 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.87456.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87456.log.gz 2026-03-24T12:09:12.685 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30348.log 2026-03-24T12:09:12.686 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.52879.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52879.log.gz 2026-03-24T12:09:12.686 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82413.log 2026-03-24T12:09:12.686 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30348.log: 0.0%gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.82271.log 2026-03-24T12:09:12.686 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.30348.log.gz 2026-03-24T12:09:12.686 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82413.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82413.log.gz 2026-03-24T12:09:12.686 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68142.log 2026-03-24T12:09:12.687 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82271.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82271.log.gz 2026-03-24T12:09:12.687 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42046.log 2026-03-24T12:09:12.687 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68142.log: 0.0%gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76860.log 2026-03-24T12:09:12.687 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.68142.log.gz 2026-03-24T12:09:12.687 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.42046.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62149.log 2026-03-24T12:09:12.687 INFO:teuthology.orchestra.run.vm05.stderr: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.42046.log.gz 2026-03-24T12:09:12.688 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.76860.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76860.log.gz 2026-03-24T12:09:12.688 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73776.log 2026-03-24T12:09:12.688 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62149.log: 0.0%gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27647.log 2026-03-24T12:09:12.688 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with 
/var/log/ceph/ceph-client.admin.62149.log.gz 2026-03-24T12:09:12.688 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.73776.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73776.log.gz 2026-03-24T12:09:12.688 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74025.log 2026-03-24T12:09:12.688 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27647.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27647.log.gz 2026-03-24T12:09:12.689 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51682.log 2026-03-24T12:09:12.689 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74025.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60614.log 2026-03-24T12:09:12.689 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74025.log.gz 2026-03-24T12:09:12.689 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.51682.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51682.log.gz 2026-03-24T12:09:12.689 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85416.log 2026-03-24T12:09:12.689 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.60614.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60614.log.gz 2026-03-24T12:09:12.689 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80745.log 2026-03-24T12:09:12.690 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.85416.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89564.log 2026-03-24T12:09:12.690 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85416.log.gz 2026-03-24T12:09:12.690 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80745.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.80745.log.gz 2026-03-24T12:09:12.690 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26125.log 2026-03-24T12:09:12.690 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.89564.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89564.log.gz 2026-03-24T12:09:12.691 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53008.log 2026-03-24T12:09:12.691 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.26125.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26125.log.gz 2026-03-24T12:09:12.691 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55717.log 2026-03-24T12:09:12.691 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.53008.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53008.log.gz 2026-03-24T12:09:12.691 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56646.log 2026-03-24T12:09:12.691 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55717.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55717.log.gz 2026-03-24T12:09:12.692 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31343.log 2026-03-24T12:09:12.692 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.56646.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27282.log 2026-03-24T12:09:12.692 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56646.log.gz 2026-03-24T12:09:12.692 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.31343.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31343.log.gz 2026-03-24T12:09:12.692 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.78393.log 2026-03-24T12:09:12.692 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27282.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27282.log.gz 2026-03-24T12:09:12.692 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65679.log 2026-03-24T12:09:12.693 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78393.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69362.log 2026-03-24T12:09:12.693 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78393.log.gz 2026-03-24T12:09:12.693 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65679.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65679.log.gz 2026-03-24T12:09:12.693 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40280.log 2026-03-24T12:09:12.693 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.69362.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69362.log.gz 2026-03-24T12:09:12.693 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50270.log 2026-03-24T12:09:12.694 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40280.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32136.log 2026-03-24T12:09:12.694 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40280.log.gz 2026-03-24T12:09:12.694 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50270.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50270.log.gz 2026-03-24T12:09:12.694 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65150.log 2026-03-24T12:09:12.694 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32136.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.37862.log 2026-03-24T12:09:12.695 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65150.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75769.log 2026-03-24T12:09:12.695 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65150.log.gz 2026-03-24T12:09:12.695 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.37862.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81242.log 2026-03-24T12:09:12.695 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75769.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75769.log.gz 2026-03-24T12:09:12.695 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33263.log 2026-03-24T12:09:12.695 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.81242.log: 25.8% -- replaced with /var/log/ceph/ceph-client.admin.37862.log.gz 2026-03-24T12:09:12.696 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66216.log 2026-03-24T12:09:12.696 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81242.log.gz 2026-03-24T12:09:12.696 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.33263.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59851.log 2026-03-24T12:09:12.696 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32136.log.gz -- replaced with /var/log/ceph/ceph-client.admin.33263.log.gz 2026-03-24T12:09:12.696 INFO:teuthology.orchestra.run.vm05.stderr: 2026-03-24T12:09:12.696 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77898.log 2026-03-24T12:09:12.696 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66216.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.66216.log.gz 2026-03-24T12:09:12.696 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59851.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43208.log 2026-03-24T12:09:12.696 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59851.log.gz 2026-03-24T12:09:12.697 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77898.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77898.log.gz 2026-03-24T12:09:12.697 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36943.log 2026-03-24T12:09:12.697 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43208.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43208.log.gz 2026-03-24T12:09:12.697 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81500.log 2026-03-24T12:09:12.697 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36943.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34012.log 2026-03-24T12:09:12.697 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36943.log.gz 2026-03-24T12:09:12.697 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.81500.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81500.log.gz 2026-03-24T12:09:12.698 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55545.log 2026-03-24T12:09:12.698 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34012.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34012.log.gz 2026-03-24T12:09:12.698 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33992.log 2026-03-24T12:09:12.698 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55545.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.36467.log 2026-03-24T12:09:12.698 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55545.log.gz 2026-03-24T12:09:12.699 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.33992.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33992.log.gz 2026-03-24T12:09:12.699 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59828.log 2026-03-24T12:09:12.699 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36467.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36467.log.gz 2026-03-24T12:09:12.699 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55352.log 2026-03-24T12:09:12.699 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59828.log: 0.0%gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60635.log 2026-03-24T12:09:12.699 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.59828.log.gz 2026-03-24T12:09:12.699 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55352.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55352.log.gz 2026-03-24T12:09:12.700 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62093.log 2026-03-24T12:09:12.700 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.60635.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60635.log.gz 2026-03-24T12:09:12.700 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34192.log 2026-03-24T12:09:12.700 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62093.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44071.log 2026-03-24T12:09:12.700 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.62093.log.gz 2026-03-24T12:09:12.700 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34192.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34192.log.gz 2026-03-24T12:09:12.700 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74885.log 2026-03-24T12:09:12.701 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44071.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44071.log.gz 2026-03-24T12:09:12.701 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48338.log 2026-03-24T12:09:12.701 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74885.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44834.log 2026-03-24T12:09:12.701 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74885.log.gz 2026-03-24T12:09:12.701 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.48338.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48338.log.gz 2026-03-24T12:09:12.701 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33229.log 2026-03-24T12:09:12.702 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44834.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27508.log 2026-03-24T12:09:12.702 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.33229.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88171.log 2026-03-24T12:09:12.702 INFO:teuthology.orchestra.run.vm05.stderr: 29.9% -- replaced with /var/log/ceph/ceph-client.admin.44834.log.gz 2026-03-24T12:09:12.702 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27508.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27508.log.gz 2026-03-24T12:09:12.702 INFO:teuthology.orchestra.run.vm05.stderr: 1.2%gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.58284.log 2026-03-24T12:09:12.702 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.33229.log.gz 2026-03-24T12:09:12.703 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27244.log 2026-03-24T12:09:12.703 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.88171.log: /var/log/ceph/ceph-client.admin.58284.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88171.log.gz 2026-03-24T12:09:12.703 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58284.log.gz 2026-03-24T12:09:12.703 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26929.log 2026-03-24T12:09:12.703 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27244.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27244.log.gzgzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78838.log 2026-03-24T12:09:12.703 INFO:teuthology.orchestra.run.vm05.stderr: 2026-03-24T12:09:12.704 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.26929.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30390.log 2026-03-24T12:09:12.704 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78838.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78838.log.gz 2026-03-24T12:09:12.704 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.26929.log.gz 2026-03-24T12:09:12.704 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69728.log 2026-03-24T12:09:12.704 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30390.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79146.log 2026-03-24T12:09:12.704 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.69728.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.69728.log.gz 2026-03-24T12:09:12.705 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77096.log 2026-03-24T12:09:12.705 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.30390.log.gz 2026-03-24T12:09:12.705 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.79146.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79146.log.gz 2026-03-24T12:09:12.705 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88756.log 2026-03-24T12:09:12.705 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77096.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77096.log.gz 2026-03-24T12:09:12.705 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45740.log 2026-03-24T12:09:12.705 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.88756.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.22818.log 2026-03-24T12:09:12.706 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88756.log.gz 2026-03-24T12:09:12.706 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45740.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45740.log.gz 2026-03-24T12:09:12.706 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44727.log 2026-03-24T12:09:12.706 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.22818.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.22818.log.gz 2026-03-24T12:09:12.706 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63634.log 2026-03-24T12:09:12.706 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44727.log: 0.0%gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.63340.log 2026-03-24T12:09:12.706 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.44727.log.gz 2026-03-24T12:09:12.707 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87827.log/var/log/ceph/ceph-client.admin.63634.log: 2026-03-24T12:09:12.707 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63634.log.gz 2026-03-24T12:09:12.707 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63340.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.22903.log 2026-03-24T12:09:12.707 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63340.log.gz 2026-03-24T12:09:12.707 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.87827.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65010.log 2026-03-24T12:09:12.707 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87827.log.gz 2026-03-24T12:09:12.707 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.22903.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.22903.log.gz 2026-03-24T12:09:12.708 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86620.log 2026-03-24T12:09:12.708 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65010.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83268.log 2026-03-24T12:09:12.708 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65010.log.gz 2026-03-24T12:09:12.708 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.86620.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43085.log 2026-03-24T12:09:12.708 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.86620.log.gz
[2026-03-24T12:09:12.708 – 2026-03-24T12:09:12.753 INFO:teuthology.orchestra.run.vm05.stderr: interleaved output from concurrent `gzip -5 --verbose` jobs compressing the teardown logs. Each /var/log/ceph/ceph-client.admin.<pid>.log (and /var/log/ceph/ceph-mgr.x.log) was replaced with its .gz counterpart; reported compression ratios range from 0.0% to 27.1%. Individual records are fragmented because the parallel gzip processes share one stderr stream.]
INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.85865.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60571.log 2026-03-24T12:09:12.753 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85865.log.gz 2026-03-24T12:09:12.753 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.31762.log.gz 2026-03-24T12:09:12.753 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77791.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77791.log.gz 2026-03-24T12:09:12.753 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28502.log 2026-03-24T12:09:12.754 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.60571.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42587.log 2026-03-24T12:09:12.754 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60571.log.gz 2026-03-24T12:09:12.754 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86274.log 2026-03-24T12:09:12.754 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28502.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28502.log.gz 2026-03-24T12:09:12.754 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.42587.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72167.log 2026-03-24T12:09:12.754 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42587.log.gz 2026-03-24T12:09:12.755 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40261.log 2026-03-24T12:09:12.755 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72167.log: /var/log/ceph/ceph-client.admin.86274.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78711.log 
2026-03-24T12:09:12.755 INFO:teuthology.orchestra.run.vm05.stderr: 58.1% -- replaced with /var/log/ceph/ceph-client.admin.72167.log.gz 2026-03-24T12:09:12.755 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40261.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86274.log.gz 2026-03-24T12:09:12.755 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40261.log.gz 2026-03-24T12:09:12.755 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39661.log 2026-03-24T12:09:12.756 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90080.log 2026-03-24T12:09:12.756 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78711.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78711.log.gz 2026-03-24T12:09:12.756 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39661.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74249.log 2026-03-24T12:09:12.756 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39661.log.gz 2026-03-24T12:09:12.756 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90080.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90080.log.gz 2026-03-24T12:09:12.756 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42980.log 2026-03-24T12:09:12.757 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49910.log 2026-03-24T12:09:12.757 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74249.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74249.log.gz 2026-03-24T12:09:12.757 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.42980.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40868.log 2026-03-24T12:09:12.757 
INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42980.log.gz 2026-03-24T12:09:12.757 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.49910.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49910.log.gz 2026-03-24T12:09:12.757 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76320.log 2026-03-24T12:09:12.757 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73903.log 2026-03-24T12:09:12.758 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40868.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40868.log.gz 2026-03-24T12:09:12.758 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67067.log 2026-03-24T12:09:12.758 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.76320.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76320.log.gz 2026-03-24T12:09:12.758 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.73903.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73903.log.gz 2026-03-24T12:09:12.758 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69962.log 2026-03-24T12:09:12.758 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77375.log 2026-03-24T12:09:12.758 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67067.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67067.log.gz 2026-03-24T12:09:12.759 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34412.log 2026-03-24T12:09:12.759 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.69962.log: /var/log/ceph/ceph-client.admin.77375.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77375.log.gz 
2026-03-24T12:09:12.759 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69962.log.gzgzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38759.log 2026-03-24T12:09:12.759 INFO:teuthology.orchestra.run.vm05.stderr: 2026-03-24T12:09:12.759 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55179.log 2026-03-24T12:09:12.759 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34412.log: /var/log/ceph/ceph-client.admin.38759.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34412.log.gz 2026-03-24T12:09:12.759 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46996.log 2026-03-24T12:09:12.759 INFO:teuthology.orchestra.run.vm05.stderr: 25.8% -- replaced with /var/log/ceph/ceph-client.admin.38759.log.gz 2026-03-24T12:09:12.760 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55179.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55179.log.gz 2026-03-24T12:09:12.760 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85102.log 2026-03-24T12:09:12.760 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46996.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46996.log.gz 2026-03-24T12:09:12.760 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86534.log 2026-03-24T12:09:12.760 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.85102.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42206.log 2026-03-24T12:09:12.760 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85102.log.gz 2026-03-24T12:09:12.760 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.86534.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86534.log.gz 2026-03-24T12:09:12.760 
INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58348.log 2026-03-24T12:09:12.761 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83123.log 2026-03-24T12:09:12.761 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.42206.log: /var/log/ceph/ceph-client.admin.58348.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85973.log 2026-03-24T12:09:12.761 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58348.log.gz 2026-03-24T12:09:12.761 INFO:teuthology.orchestra.run.vm05.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.42206.log.gz 2026-03-24T12:09:12.761 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83123.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83123.log.gz 2026-03-24T12:09:12.761 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68911.log 2026-03-24T12:09:12.761 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.22776.log 2026-03-24T12:09:12.762 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.85973.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85973.log.gz 2026-03-24T12:09:12.762 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46889.log 2026-03-24T12:09:12.762 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68911.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68911.log.gz 2026-03-24T12:09:12.762 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.22776.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.22776.log.gz 2026-03-24T12:09:12.762 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87438.log 2026-03-24T12:09:12.762 
INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49221.log 2026-03-24T12:09:12.763 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46889.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46889.log.gz 2026-03-24T12:09:12.763 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.87438.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87438.log.gz 2026-03-24T12:09:12.763 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81672.log 2026-03-24T12:09:12.763 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.49221.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49221.log.gz 2026-03-24T12:09:12.763 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81608.log 2026-03-24T12:09:12.763 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.81672.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86317.log 2026-03-24T12:09:12.763 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81672.log.gz 2026-03-24T12:09:12.764 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.81608.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81608.log.gz 2026-03-24T12:09:12.764 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57119.log 2026-03-24T12:09:12.764 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83080.log 2026-03-24T12:09:12.764 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.86317.log: /var/log/ceph/ceph-client.admin.57119.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75510.log 2026-03-24T12:09:12.764 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86317.log.gz 
2026-03-24T12:09:12.764 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57119.log.gz 2026-03-24T12:09:12.764 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83080.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83080.log.gz 2026-03-24T12:09:12.764 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63128.log 2026-03-24T12:09:12.765 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76191.log 2026-03-24T12:09:12.765 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75510.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75510.log.gz 2026-03-24T12:09:12.765 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63128.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90295.log 2026-03-24T12:09:12.765 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.76191.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76191.log.gz 2026-03-24T12:09:12.765 INFO:teuthology.orchestra.run.vm05.stderr: 55.0% -- replaced with /var/log/ceph/ceph-client.admin.63128.log.gz 2026-03-24T12:09:12.765 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80487.log 2026-03-24T12:09:12.766 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90295.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86448.log 2026-03-24T12:09:12.766 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90295.log.gz 2026-03-24T12:09:12.766 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80487.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80487.log.gz 2026-03-24T12:09:12.766 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74813.log 2026-03-24T12:09:12.766 
INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40362.log 2026-03-24T12:09:12.766 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.86448.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86448.log.gz 2026-03-24T12:09:12.766 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47819.log 2026-03-24T12:09:12.766 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74813.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74813.log.gz 2026-03-24T12:09:12.767 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40362.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40362.log.gz 2026-03-24T12:09:12.767 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68991.log 2026-03-24T12:09:12.767 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74094.log 2026-03-24T12:09:12.767 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.47819.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47819.log.gz 2026-03-24T12:09:12.767 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32461.log 2026-03-24T12:09:12.767 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68991.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68991.log.gz 2026-03-24T12:09:12.767 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74094.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74094.log.gz 2026-03-24T12:09:12.767 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53202.log 2026-03-24T12:09:12.768 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75318.log 2026-03-24T12:09:12.768 
INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32461.log: /var/log/ceph/ceph-client.admin.53202.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67303.log 2026-03-24T12:09:12.768 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53202.log.gz 2026-03-24T12:09:12.768 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32461.log.gz 2026-03-24T12:09:12.768 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75318.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75318.log.gz 2026-03-24T12:09:12.768 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46172.log 2026-03-24T12:09:12.768 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40889.log 2026-03-24T12:09:12.769 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67303.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67303.log.gz 2026-03-24T12:09:12.769 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56582.log 2026-03-24T12:09:12.769 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46172.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46172.log.gz 2026-03-24T12:09:12.769 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40889.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40889.log.gz 2026-03-24T12:09:12.769 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42407.log 2026-03-24T12:09:12.769 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85607.log 2026-03-24T12:09:12.770 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.56582.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56582.log.gz 
2026-03-24T12:09:12.770 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.42407.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39962.log 2026-03-24T12:09:12.770 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42407.log.gz 2026-03-24T12:09:12.770 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.85607.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85607.log.gz 2026-03-24T12:09:12.770 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46739.log 2026-03-24T12:09:12.770 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74268.log 2026-03-24T12:09:12.770 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39962.log: /var/log/ceph/ceph-client.admin.46739.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80982.log 2026-03-24T12:09:12.771 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74268.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74268.log.gz 2026-03-24T12:09:12.771 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46739.log.gz 2026-03-24T12:09:12.771 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39962.log.gz 2026-03-24T12:09:12.771 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42836.log 2026-03-24T12:09:12.771 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80982.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80982.log.gz 2026-03-24T12:09:12.771 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90768.log 2026-03-24T12:09:12.771 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.42836.log: gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.58456.log 2026-03-24T12:09:12.772 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42836.log.gz 2026-03-24T12:09:12.772 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90768.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.90768.log.gz 2026-03-24T12:09:12.772 INFO:teuthology.orchestra.run.vm05.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.63934.log 2026-03-24T12:09:12.772 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36096.log 2026-03-24T12:09:12.772 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.58456.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58456.log.gz 2026-03-24T12:09:12.772 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63934.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66916.log 2026-03-24T12:09:12.772 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63934.log.gz 2026-03-24T12:09:12.772 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36096.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36096.log.gz 2026-03-24T12:09:12.773 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49781.log 2026-03-24T12:09:12.773 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60098.log 2026-03-24T12:09:12.773 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66916.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66916.log.gz 2026-03-24T12:09:12.773 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88042.log 2026-03-24T12:09:12.773 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.49781.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.49781.log.gz 2026-03-24T12:09:12.773 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.60098.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60098.log.gz 2026-03-24T12:09:12.773 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58155.log 2026-03-24T12:09:12.774 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78436.log 2026-03-24T12:09:12.774 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.88042.log: /var/log/ceph/ceph-client.admin.58155.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37156.log 2026-03-24T12:09:12.774 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58155.log.gz 2026-03-24T12:09:12.774 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88042.log.gz 2026-03-24T12:09:12.774 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78436.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78436.log.gz 2026-03-24T12:09:12.774 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27922.log 2026-03-24T12:09:12.775 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66195.log 2026-03-24T12:09:12.775 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.37156.log: /var/log/ceph/ceph-client.admin.27922.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27922.log.gz 2026-03-24T12:09:12.775 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79839.log 2026-03-24T12:09:12.775 INFO:teuthology.orchestra.run.vm05.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.37156.log.gz 2026-03-24T12:09:12.775 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66195.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.68056.log 2026-03-24T12:09:12.775 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66195.log.gz 2026-03-24T12:09:12.775 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.79839.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79839.log.gz 2026-03-24T12:09:12.775 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35651.log 2026-03-24T12:09:12.776 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83635.log 2026-03-24T12:09:12.776 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68056.log: /var/log/ceph/ceph-client.admin.35651.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39446.log 2026-03-24T12:09:12.776 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68056.log.gz 2026-03-24T12:09:12.776 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.35651.log.gz 2026-03-24T12:09:12.776 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83635.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83635.log.gz 2026-03-24T12:09:12.776 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78242.log 2026-03-24T12:09:12.776 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44333.log 2026-03-24T12:09:12.777 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39446.log: /var/log/ceph/ceph-client.admin.78242.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44412.log 2026-03-24T12:09:12.777 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39446.log.gz 2026-03-24T12:09:12.777 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.78242.log.gz
2026-03-24T12:09:12.777 INFO:teuthology.orchestra.run.vm05.stderr: [interleaved stderr from parallel `gzip -5 --verbose` jobs, 2026-03-24T12:09:12.777 through 12:09:12.813: each /var/log/ceph/ceph-client.admin.<pid>.log was replaced with its .log.gz; compression ratios were mostly 0.0%-2.5%, with a few larger logs at 10.7%, 26.7%, 27.1%, 28.6%, 54.6%, 56.0%, 56.4%, and 67.9%]
2026-03-24T12:09:12.813 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced
with /var/log/ceph/ceph-client.admin.87784.log.gz 2026-03-24T12:09:12.813 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67217.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67217.log.gz 2026-03-24T12:09:12.813 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29767.log 2026-03-24T12:09:12.813 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42774.log 2026-03-24T12:09:12.813 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80831.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80831.log.gz 2026-03-24T12:09:12.813 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36198.log 2026-03-24T12:09:12.814 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.29767.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29767.log.gz 2026-03-24T12:09:12.814 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.42774.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42774.log.gz 2026-03-24T12:09:12.814 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27627.log 2026-03-24T12:09:12.814 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28266.log 2026-03-24T12:09:12.814 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36198.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36198.log.gz 2026-03-24T12:09:12.814 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79699.log 2026-03-24T12:09:12.814 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27627.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27627.log.gz 2026-03-24T12:09:12.815 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28266.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.28266.log.gz 2026-03-24T12:09:12.815 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87956.log 2026-03-24T12:09:12.815 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49975.log 2026-03-24T12:09:12.815 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.79699.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79699.log.gz 2026-03-24T12:09:12.815 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82009.log 2026-03-24T12:09:12.815 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.87956.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87956.log.gz 2026-03-24T12:09:12.815 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.49975.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49975.log.gz 2026-03-24T12:09:12.815 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31643.log 2026-03-24T12:09:12.816 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41864.log 2026-03-24T12:09:12.816 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82009.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82009.log.gz 2026-03-24T12:09:12.816 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67260.log 2026-03-24T12:09:12.816 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.31643.log: /var/log/ceph/ceph-client.admin.41864.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35495.log 2026-03-24T12:09:12.816 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.31643.log.gz 2026-03-24T12:09:12.816 INFO:teuthology.orchestra.run.vm05.stderr: 58.4% -- replaced with 
/var/log/ceph/ceph-client.admin.41864.log.gz 2026-03-24T12:09:12.816 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67260.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75682.log 2026-03-24T12:09:12.817 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67260.log.gz 2026-03-24T12:09:12.817 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35495.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35495.log.gz 2026-03-24T12:09:12.817 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55093.log 2026-03-24T12:09:12.817 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75682.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69814.log 2026-03-24T12:09:12.817 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75682.log.gz 2026-03-24T12:09:12.817 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55093.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73521.log 2026-03-24T12:09:12.817 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55093.log.gz 2026-03-24T12:09:12.818 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.69814.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69814.log.gz 2026-03-24T12:09:12.818 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75437.log 2026-03-24T12:09:12.818 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60678.log 2026-03-24T12:09:12.818 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.73521.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73521.log.gz 2026-03-24T12:09:12.818 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75437.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.33161.log 2026-03-24T12:09:12.819 INFO:teuthology.orchestra.run.vm05.stderr: 0.0%gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35172.log 2026-03-24T12:09:12.819 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.60678.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60678.log.gz 2026-03-24T12:09:12.819 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.33161.log: -- replaced with /var/log/ceph/ceph-client.admin.75437.log.gz 2026-03-24T12:09:12.819 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.33161.log.gz 2026-03-24T12:09:12.819 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51036.log 2026-03-24T12:09:12.819 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86188.log 2026-03-24T12:09:12.819 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35172.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35172.log.gz 2026-03-24T12:09:12.820 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64464.log 2026-03-24T12:09:12.820 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.51036.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51036.log.gz 2026-03-24T12:09:12.820 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.86188.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.86188.log.gz -5 --verbose 2026-03-24T12:09:12.820 INFO:teuthology.orchestra.run.vm05.stderr: -- /var/log/ceph/ceph-client.admin.79527.log 2026-03-24T12:09:12.820 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90940.log 2026-03-24T12:09:12.820 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64464.log: /var/log/ceph/ceph-client.admin.79527.log: gzip 
-5 --verbose -- /var/log/ceph/ceph-client.admin.61799.log 2026-03-24T12:09:12.820 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79527.log.gz 2026-03-24T12:09:12.820 INFO:teuthology.orchestra.run.vm05.stderr: 54.6% -- replaced with /var/log/ceph/ceph-client.admin.64464.log.gz 2026-03-24T12:09:12.821 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90940.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90940.log.gz 2026-03-24T12:09:12.821 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89219.log 2026-03-24T12:09:12.821 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61259.log 2026-03-24T12:09:12.821 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61799.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61799.log.gz 2026-03-24T12:09:12.821 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.89219.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66959.log 2026-03-24T12:09:12.821 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89219.log.gz 2026-03-24T12:09:12.821 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61259.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30717.log 2026-03-24T12:09:12.821 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61259.log.gz 2026-03-24T12:09:12.822 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85269.log 2026-03-24T12:09:12.822 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66959.log: /var/log/ceph/ceph-client.admin.30717.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53137.log 2026-03-24T12:09:12.822 INFO:teuthology.orchestra.run.vm05.stderr: 55.3% -- replaced with 
/var/log/ceph/ceph-client.admin.66959.log.gz 2026-03-24T12:09:12.822 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.85269.log: 0.0% 26.4% -- replaced with /var/log/ceph/ceph-client.admin.85269.log.gz 2026-03-24T12:09:12.822 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.30717.log.gz 2026-03-24T12:09:12.822 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66664.log 2026-03-24T12:09:12.823 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65347.log 2026-03-24T12:09:12.823 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.53137.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53137.log.gz 2026-03-24T12:09:12.823 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66664.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66664.log.gz 2026-03-24T12:09:12.823 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66066.log 2026-03-24T12:09:12.823 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65347.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59398.log 2026-03-24T12:09:12.823 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65347.log.gz 2026-03-24T12:09:12.824 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40127.log 2026-03-24T12:09:12.824 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66066.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66066.log.gz 2026-03-24T12:09:12.824 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59398.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59398.log.gz 2026-03-24T12:09:12.824 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.27667.log 2026-03-24T12:09:12.824 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40127.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50585.log 2026-03-24T12:09:12.824 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40127.log.gz 2026-03-24T12:09:12.824 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74491.log 2026-03-24T12:09:12.825 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27667.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27667.log.gz 2026-03-24T12:09:12.825 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50585.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65958.log 2026-03-24T12:09:12.825 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50585.log.gz 2026-03-24T12:09:12.825 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74491.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41405.log 2026-03-24T12:09:12.825 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74491.log.gz 2026-03-24T12:09:12.825 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56818.log 2026-03-24T12:09:12.825 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65958.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65958.log.gz 2026-03-24T12:09:12.825 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41405.log: gzip 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41405.log.gz -5 2026-03-24T12:09:12.825 INFO:teuthology.orchestra.run.vm05.stderr: --verbose -- /var/log/ceph/ceph-client.admin.82478.log 2026-03-24T12:09:12.826 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.56818.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.55653.log 2026-03-24T12:09:12.826 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56818.log.gz 2026-03-24T12:09:12.826 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76040.log 2026-03-24T12:09:12.826 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82478.log: /var/log/ceph/ceph-client.admin.55653.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82478.log.gz 2026-03-24T12:09:12.826 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43371.log 2026-03-24T12:09:12.826 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55653.log.gz 2026-03-24T12:09:12.826 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.76040.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68873.log 2026-03-24T12:09:12.826 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76040.log.gz 2026-03-24T12:09:12.827 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43371.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66873.log 2026-03-24T12:09:12.827 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43371.log.gz 2026-03-24T12:09:12.827 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83101.log 2026-03-24T12:09:12.827 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68873.log: /var/log/ceph/ceph-client.admin.66873.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.68873.log.gz 2026-03-24T12:09:12.827 INFO:teuthology.orchestra.run.vm05.stderr: 26.6% -- replaced with /var/log/ceph/ceph-client.admin.66873.log.gz 2026-03-24T12:09:12.831 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.59125.log 2026-03-24T12:09:12.831 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83101.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35872.log 2026-03-24T12:09:12.831 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83101.log.gz 2026-03-24T12:09:12.831 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59125.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59125.log.gz 2026-03-24T12:09:12.831 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79943.log 2026-03-24T12:09:12.831 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35872.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27793.log 2026-03-24T12:09:12.832 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35872.log.gz 2026-03-24T12:09:12.832 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33178.log 2026-03-24T12:09:12.832 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.79943.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79943.log.gz 2026-03-24T12:09:12.832 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27793.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49760.log 2026-03-24T12:09:12.832 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27793.log.gz 2026-03-24T12:09:12.832 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33492.log 2026-03-24T12:09:12.832 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.33178.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31364.log 2026-03-24T12:09:12.833 INFO:teuthology.orchestra.run.vm05.stderr: 2.5% -- replaced with 
/var/log/ceph/ceph-client.admin.33178.log.gz 2026-03-24T12:09:12.833 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.49760.log: /var/log/ceph/ceph-client.admin.33492.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33492.log.gz 2026-03-24T12:09:12.833 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49760.log.gz 2026-03-24T12:09:12.833 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91218.log 2026-03-24T12:09:12.833 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65808.log 2026-03-24T12:09:12.833 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.31364.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31364.log.gz 2026-03-24T12:09:12.833 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36079.log 2026-03-24T12:09:12.834 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.91218.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91218.log.gz/var/log/ceph/ceph-client.admin.65808.log: 2026-03-24T12:09:12.834 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65808.log.gz 2026-03-24T12:09:12.834 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76731.log 2026-03-24T12:09:12.834 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78873.log 2026-03-24T12:09:12.834 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36079.log: /var/log/ceph/ceph-client.admin.76731.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28908.log 2026-03-24T12:09:12.834 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76731.log.gz 2026-03-24T12:09:12.834 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.36079.log.gz 2026-03-24T12:09:12.835 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78873.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78873.log.gz 2026-03-24T12:09:12.835 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81436.log 2026-03-24T12:09:12.835 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62772.log 2026-03-24T12:09:12.835 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28908.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28908.log.gz 2026-03-24T12:09:12.835 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41513.log 2026-03-24T12:09:12.835 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.81436.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81436.log.gz 2026-03-24T12:09:12.835 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62772.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62772.log.gz 2026-03-24T12:09:12.835 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90875.log 2026-03-24T12:09:12.836 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85530.log 2026-03-24T12:09:12.836 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41513.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41513.log.gz 2026-03-24T12:09:12.836 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32734.log 2026-03-24T12:09:12.836 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90875.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90875.log.gz 2026-03-24T12:09:12.836 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.85530.log: 
0.0% -- replaced with /var/log/ceph/ceph-client.admin.85530.log.gz 2026-03-24T12:09:12.836 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91025.log 2026-03-24T12:09:12.836 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73920.log 2026-03-24T12:09:12.837 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32734.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60399.log 2026-03-24T12:09:12.837 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.91025.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32734.log.gz 2026-03-24T12:09:12.837 INFO:teuthology.orchestra.run.vm05.stderr: 0.0%/var/log/ceph/ceph-client.admin.73920.log: -- replaced with /var/log/ceph/ceph-client.admin.91025.log.gz 2026-03-24T12:09:12.837 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73920.log.gz 2026-03-24T12:09:12.837 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58984.log 2026-03-24T12:09:12.837 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66001.log 2026-03-24T12:09:12.837 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.60399.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60399.log.gz 2026-03-24T12:09:12.837 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61065.log 2026-03-24T12:09:12.838 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.58984.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58984.log.gz/var/log/ceph/ceph-client.admin.66001.log: 2026-03-24T12:09:12.838 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66001.log.gz 2026-03-24T12:09:12.838 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.27110.log 2026-03-24T12:09:12.838 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34512.log 2026-03-24T12:09:12.838 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61065.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61065.log.gz 2026-03-24T12:09:12.838 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38339.log 2026-03-24T12:09:12.839 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27110.log: /var/log/ceph/ceph-client.admin.34512.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34512.log.gz 2026-03-24T12:09:12.839 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41966.log 2026-03-24T12:09:12.839 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.27110.log.gz 2026-03-24T12:09:12.839 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61840.log 2026-03-24T12:09:12.839 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.38339.log: /var/log/ceph/ceph-client.admin.41966.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41966.log.gz 2026-03-24T12:09:12.840 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79340.log 2026-03-24T12:09:12.840 INFO:teuthology.orchestra.run.vm05.stderr: 26.2% -- replaced with /var/log/ceph/ceph-client.admin.38339.log.gz 2026-03-24T12:09:12.840 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89322.log 2026-03-24T12:09:12.840 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61840.log: 0.0%/var/log/ceph/ceph-client.admin.79340.log: -- replaced with /var/log/ceph/ceph-client.admin.61840.log.gz 2026-03-24T12:09:12.840 INFO:teuthology.orchestra.run.vm05.stderr: 0.0%gzip -- 
replaced with /var/log/ceph/ceph-client.admin.79340.log.gz -5 2026-03-24T12:09:12.840 INFO:teuthology.orchestra.run.vm05.stderr: --verbose -- /var/log/ceph/ceph-client.admin.57248.log 2026-03-24T12:09:12.840 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.89322.log: 58.9% -- replaced with /var/log/ceph/ceph-client.admin.89322.log.gz 2026-03-24T12:09:12.841 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59361.log 2026-03-24T12:09:12.841 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69857.log 2026-03-24T12:09:12.841 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.57248.log: /var/log/ceph/ceph-client.admin.59361.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57248.log.gz 2026-03-24T12:09:12.841 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59361.log.gz 2026-03-24T12:09:12.841 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29509.log 2026-03-24T12:09:12.841 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.69857.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71631.log 2026-03-24T12:09:12.841 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69857.log.gz 2026-03-24T12:09:12.842 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.29509.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81156.log 2026-03-24T12:09:12.842 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29509.log.gz 2026-03-24T12:09:12.842 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44212.log 2026-03-24T12:09:12.842 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.71631.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.71631.log.gz 2026-03-24T12:09:12.842 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27206.log 2026-03-24T12:09:12.842 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.81156.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81156.log.gz 2026-03-24T12:09:12.842 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44212.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44212.log.gz 2026-03-24T12:09:12.842 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75170.log 2026-03-24T12:09:12.843 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39001.log 2026-03-24T12:09:12.843 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27206.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27206.log.gz 2026-03-24T12:09:12.843 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69900.log 2026-03-24T12:09:12.843 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75170.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75170.log.gz 2026-03-24T12:09:12.843 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39001.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87935.log 2026-03-24T12:09:12.843 INFO:teuthology.orchestra.run.vm05.stderr: 24.2% -- replaced with /var/log/ceph/ceph-client.admin.39001.log.gzgzip 2026-03-24T12:09:12.843 INFO:teuthology.orchestra.run.vm05.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.25814.log 2026-03-24T12:09:12.844 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.69900.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69900.log.gz 2026-03-24T12:09:12.844 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.87935.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.87935.log.gz 2026-03-24T12:09:12.844 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60033.log 2026-03-24T12:09:12.844 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78613.log 2026-03-24T12:09:12.844 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.25814.log: /var/log/ceph/ceph-client.admin.60033.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42711.log 2026-03-24T12:09:12.845 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60033.log.gz 2026-03-24T12:09:12.845 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25814.log.gz 2026-03-24T12:09:12.845 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78613.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78613.log.gz 2026-03-24T12:09:12.845 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82521.log 2026-03-24T12:09:12.845 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40203.log 2026-03-24T12:09:12.845 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.42711.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89499.log 2026-03-24T12:09:12.845 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82521.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42711.log.gz 2026-03-24T12:09:12.845 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82521.log.gz/var/log/ceph/ceph-client.admin.40203.log: 2026-03-24T12:09:12.846 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55524.log 2026-03-24T12:09:12.846 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with 
/var/log/ceph/ceph-client.admin.40203.log.gz 2026-03-24T12:09:12.846 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44877.log 2026-03-24T12:09:12.846 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.89499.log: /var/log/ceph/ceph-client.admin.55524.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64147.log 2026-03-24T12:09:12.846 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89499.log.gz 2026-03-24T12:09:12.846 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.55524.log.gz 2026-03-24T12:09:12.846 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44877.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44877.log.gz 2026-03-24T12:09:12.847 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30891.log 2026-03-24T12:09:12.847 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36130.log 2026-03-24T12:09:12.847 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64147.log: /var/log/ceph/ceph-client.admin.30891.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30891.log.gz 2026-03-24T12:09:12.847 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29832.log 2026-03-24T12:09:12.847 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64147.log.gz 2026-03-24T12:09:12.848 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36130.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85650.log 2026-03-24T12:09:12.848 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36130.log.gz 2026-03-24T12:09:12.848 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.29832.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.77920.log 2026-03-24T12:09:12.848 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29832.log.gz 2026-03-24T12:09:12.848 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.85650.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28180.log 2026-03-24T12:09:12.848 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85650.log.gz 2026-03-24T12:09:12.848 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77920.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64070.log 2026-03-24T12:09:12.848 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77920.log.gz 2026-03-24T12:09:12.848 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87278.log 2026-03-24T12:09:12.849 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28180.log: /var/log/ceph/ceph-client.admin.64070.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28180.log.gz 2026-03-24T12:09:12.849 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64070.log.gz 2026-03-24T12:09:12.849 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27566.log 2026-03-24T12:09:12.849 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.87278.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49157.log 2026-03-24T12:09:12.849 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87278.log.gz 2026-03-24T12:09:12.849 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27566.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27566.log.gz 2026-03-24T12:09:12.849 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.45307.log 2026-03-24T12:09:12.849 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57466.log 2026-03-24T12:09:12.850 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.49157.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49157.log.gz 2026-03-24T12:09:12.850 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80423.log 2026-03-24T12:09:12.850 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45307.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45307.log.gz 2026-03-24T12:09:12.850 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.57466.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57466.log.gz 2026-03-24T12:09:12.850 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45826.log 2026-03-24T12:09:12.850 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64052.log 2026-03-24T12:09:12.851 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80423.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80423.log.gz 2026-03-24T12:09:12.851 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69341.log 2026-03-24T12:09:12.851 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64052.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32904.log 2026-03-24T12:09:12.851 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64052.log.gz 2026-03-24T12:09:12.851 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45826.log: /var/log/ceph/ceph-client.admin.69341.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45826.log.gz 2026-03-24T12:09:12.851 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.69341.log.gz 2026-03-24T12:09:12.851 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61859.log 2026-03-24T12:09:12.852 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67948.log 2026-03-24T12:09:12.852 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32904.log: /var/log/ceph/ceph-client.admin.61859.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84565.log 2026-03-24T12:09:12.852 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61859.log.gz 2026-03-24T12:09:12.852 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32904.log.gz 2026-03-24T12:09:12.852 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67948.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67948.log.gz 2026-03-24T12:09:12.852 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86878.log 2026-03-24T12:09:12.853 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56474.log 2026-03-24T12:09:12.853 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84565.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84565.log.gz 2026-03-24T12:09:12.853 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.86878.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56754.log 2026-03-24T12:09:12.853 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86878.log.gz 2026-03-24T12:09:12.853 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77941.log 2026-03-24T12:09:12.853 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.56474.log: gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.85908.log 2026-03-24T12:09:12.853 INFO:teuthology.orchestra.run.vm05.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.56474.log.gz 2026-03-24T12:09:12.853 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.56754.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56754.log.gz 2026-03-24T12:09:12.854 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77941.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77941.log.gz 2026-03-24T12:09:12.854 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85715.log 2026-03-24T12:09:12.854 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52922.log 2026-03-24T12:09:12.854 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.85908.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85908.log.gz 2026-03-24T12:09:12.854 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.85715.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85715.log.gz 2026-03-24T12:09:12.855 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32716.log 2026-03-24T12:09:12.855 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77182.log 2026-03-24T12:09:12.855 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.52922.log: /var/log/ceph/ceph-client.admin.32716.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52922.log.gz 2026-03-24T12:09:12.855 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31728.log 2026-03-24T12:09:12.856 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77182.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50416.log 2026-03-24T12:09:12.856 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.77182.log.gz 2.5% -- replaced with /var/log/ceph/ceph-client.admin.32716.log.gz 2026-03-24T12:09:12.856 INFO:teuthology.orchestra.run.vm05.stderr: 2026-03-24T12:09:12.856 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.31728.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.31728.log.gz 2026-03-24T12:09:12.859 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77769.log 2026-03-24T12:09:12.859 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50416.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50416.log.gz 2026-03-24T12:09:12.859 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60829.log 2026-03-24T12:09:12.859 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77769.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46564.log 2026-03-24T12:09:12.859 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77769.log.gz 2026-03-24T12:09:12.859 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.60829.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60829.log.gz 2026-03-24T12:09:12.860 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57033.log 2026-03-24T12:09:12.860 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40975.log 2026-03-24T12:09:12.860 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46564.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46564.log.gz 2026-03-24T12:09:12.860 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31966.log 2026-03-24T12:09:12.860 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.57033.log: /var/log/ceph/ceph-client.admin.40975.log: 
0.0% -- replaced with /var/log/ceph/ceph-client.admin.57033.log.gz 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.40975.log.gz -5 2026-03-24T12:09:12.861 INFO:teuthology.orchestra.run.vm05.stderr: --verbose -- /var/log/ceph/ceph-client.admin.77225.log 2026-03-24T12:09:12.861 INFO:teuthology.orchestra.run.vm05.stderr: 2026-03-24T12:09:12.861 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.31966.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.31966.log.gz 2026-03-24T12:09:12.861 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65700.log 2026-03-24T12:09:12.861 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83374.log 2026-03-24T12:09:12.861 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77225.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77225.log.gz 2026-03-24T12:09:12.862 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49480.log 2026-03-24T12:09:12.862 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65700.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65700.log.gz/var/log/ceph/ceph-client.admin.83374.log: 2026-03-24T12:09:12.862 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44312.log 2026-03-24T12:09:12.862 INFO:teuthology.orchestra.run.vm05.stderr: 82.7% -- replaced with /var/log/ceph/ceph-client.admin.83374.log.gz 2026-03-24T12:09:12.862 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77401.log 2026-03-24T12:09:12.862 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.49480.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49480.log.gz 2026-03-24T12:09:12.862 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84952.log 
2026-03-24T12:09:12.863 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44312.log: /var/log/ceph/ceph-client.admin.77401.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44312.log.gz 2026-03-24T12:09:12.863 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77401.log.gz 2026-03-24T12:09:12.863 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65636.log 2026-03-24T12:09:12.863 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35372.log 2026-03-24T12:09:12.863 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84952.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84952.log.gz 2026-03-24T12:09:12.863 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65636.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65636.log.gz 2026-03-24T12:09:12.863 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90574.log 2026-03-24T12:09:12.864 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35372.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79394.log 2026-03-24T12:09:12.864 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35372.log.gz 2026-03-24T12:09:12.864 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90574.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84973.log 2026-03-24T12:09:12.864 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90574.log.gz 2026-03-24T12:09:12.864 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76989.log 2026-03-24T12:09:12.864 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.79394.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.79394.log.gz 2026-03-24T12:09:12.864 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49631.log 2026-03-24T12:09:12.864 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84973.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84973.log.gz/var/log/ceph/ceph-client.admin.76989.log: 2026-03-24T12:09:12.865 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76989.log.gz 2026-03-24T12:09:12.865 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82218.log 2026-03-24T12:09:12.865 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38654.log 2026-03-24T12:09:12.865 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.49631.log: /var/log/ceph/ceph-client.admin.82218.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49631.log.gz 2026-03-24T12:09:12.865 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82218.log.gz 2026-03-24T12:09:12.865 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87913.log 2026-03-24T12:09:12.865 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.38654.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60979.log 2026-03-24T12:09:12.865 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.87913.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83165.log 2026-03-24T12:09:12.865 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87913.log.gz 2026-03-24T12:09:12.866 INFO:teuthology.orchestra.run.vm05.stderr: 26.6% -- replaced with /var/log/ceph/ceph-client.admin.38654.log.gz 2026-03-24T12:09:12.866 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.60979.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.60979.log.gz 2026-03-24T12:09:12.866 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43354.log 2026-03-24T12:09:12.866 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60076.log 2026-03-24T12:09:12.866 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83165.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83165.log.gz 2026-03-24T12:09:12.867 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41384.log 2026-03-24T12:09:12.867 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43354.log: /var/log/ceph/ceph-client.admin.60076.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31711.log 2026-03-24T12:09:12.867 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60076.log.gz 2026-03-24T12:09:12.867 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.43354.log.gz 2026-03-24T12:09:12.867 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41384.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41384.log.gz 2026-03-24T12:09:12.867 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40298.log 2026-03-24T12:09:12.868 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44748.log 2026-03-24T12:09:12.868 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.31711.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88064.log 2026-03-24T12:09:12.868 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40298.log: 0.0% 1.2% -- replaced with /var/log/ceph/ceph-client.admin.31711.log.gz 2026-03-24T12:09:12.868 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with 
/var/log/ceph/ceph-client.admin.40298.log.gz 2026-03-24T12:09:12.868 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44748.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44748.log.gz 2026-03-24T12:09:12.868 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45092.log 2026-03-24T12:09:12.868 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83893.log 2026-03-24T12:09:12.869 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.88064.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88064.log.gz 2026-03-24T12:09:12.869 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34212.log 2026-03-24T12:09:12.869 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45092.log: /var/log/ceph/ceph-client.admin.83893.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83893.log.gz 2026-03-24T12:09:12.869 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79892.log 2026-03-24T12:09:12.869 INFO:teuthology.orchestra.run.vm05.stderr: 27.2% -- replaced with /var/log/ceph/ceph-client.admin.45092.log.gz 2026-03-24T12:09:12.869 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78328.log 2026-03-24T12:09:12.869 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34212.log: /var/log/ceph/ceph-client.admin.79892.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34212.log.gz 2026-03-24T12:09:12.869 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79892.log.gz 2026-03-24T12:09:12.869 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78925.log 2026-03-24T12:09:12.870 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78328.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.65657.log 2026-03-24T12:09:12.870 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78328.log.gz 2026-03-24T12:09:12.870 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78925.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39575.log 2026-03-24T12:09:12.870 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78925.log.gz 2026-03-24T12:09:12.870 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55200.log 2026-03-24T12:09:12.870 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65657.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65657.log.gz 2026-03-24T12:09:12.870 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66254.log 2026-03-24T12:09:12.871 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39575.log: 0.0%/var/log/ceph/ceph-client.admin.55200.log: -- replaced with /var/log/ceph/ceph-client.admin.39575.log.gz 2026-03-24T12:09:12.871 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55200.log.gz 2026-03-24T12:09:12.871 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84151.log 2026-03-24T12:09:12.871 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53180.log 2026-03-24T12:09:12.871 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66254.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66254.log.gz 2026-03-24T12:09:12.871 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77572.log 2026-03-24T12:09:12.871 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84151.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.84151.log.gz/var/log/ceph/ceph-client.admin.53180.log: 2026-03-24T12:09:12.871 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53180.log.gz 2026-03-24T12:09:12.871 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83037.log 2026-03-24T12:09:12.872 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29552.log 2026-03-24T12:09:12.872 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77572.log: /var/log/ceph/ceph-client.admin.83037.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77572.log.gz 2026-03-24T12:09:12.872 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83037.log.gz 2026-03-24T12:09:12.872 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43772.log 2026-03-24T12:09:12.872 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.29552.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50692.log 2026-03-24T12:09:12.872 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29552.log.gz 2026-03-24T12:09:12.872 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43772.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50735.log 2026-03-24T12:09:12.872 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43772.log.gz 2026-03-24T12:09:12.873 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29166.log 2026-03-24T12:09:12.873 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50692.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50692.log.gz 2026-03-24T12:09:12.873 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.66237.log 2026-03-24T12:09:12.873 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50735.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50735.log.gz/var/log/ceph/ceph-client.admin.29166.log: 2026-03-24T12:09:12.873 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29166.log.gz 2026-03-24T12:09:12.873 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81715.log 2026-03-24T12:09:12.873 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58370.log 2026-03-24T12:09:12.873 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66237.log: /var/log/ceph/ceph-client.admin.81715.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81715.log.gz 2026-03-24T12:09:12.874 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40679.log 2026-03-24T12:09:12.874 INFO:teuthology.orchestra.run.vm05.stderr: 29.6% -- replaced with /var/log/ceph/ceph-client.admin.66237.log.gz 2026-03-24T12:09:12.874 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71717.log 2026-03-24T12:09:12.874 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.58370.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58370.log.gz 2026-03-24T12:09:12.874 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40679.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89342.log 2026-03-24T12:09:12.874 INFO:teuthology.orchestra.run.vm05.stderr: 26.2% -- replaced with /var/log/ceph/ceph-client.admin.40679.log.gz 2026-03-24T12:09:12.875 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.71717.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33298.log 2026-03-24T12:09:12.875 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.71717.log.gz 2026-03-24T12:09:12.875 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.89342.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89342.log.gz 2026-03-24T12:09:12.875 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37345.log 2026-03-24T12:09:12.875 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53476.log 2026-03-24T12:09:12.875 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.33298.log: /var/log/ceph/ceph-client.admin.37345.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72721.log 2026-03-24T12:09:12.876 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% 26.8% -- replaced with /var/log/ceph/ceph-client.admin.37345.log.gz 2026-03-24T12:09:12.876 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.33298.log.gz 2026-03-24T12:09:12.876 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.53476.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53476.log.gz 2026-03-24T12:09:12.876 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69255.log 2026-03-24T12:09:12.876 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28158.log 2026-03-24T12:09:12.876 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72721.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72721.log.gz 2026-03-24T12:09:12.876 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.69255.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73718.log 2026-03-24T12:09:12.876 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69255.log.gz 2026-03-24T12:09:12.877 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.57958.log 2026-03-24T12:09:12.877 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28158.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28158.log.gz 2026-03-24T12:09:12.877 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27186.log 2026-03-24T12:09:12.877 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.73718.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73718.log.gz 2026-03-24T12:09:12.877 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.57958.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43511.log 2026-03-24T12:09:12.877 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57958.log.gz 2026-03-24T12:09:12.877 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27186.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89542.log 2026-03-24T12:09:12.877 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.27186.log.gz 2026-03-24T12:09:12.878 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37786.log 2026-03-24T12:09:12.878 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43511.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43511.log.gz 2026-03-24T12:09:12.878 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39468.log 2026-03-24T12:09:12.878 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.89542.log: 0.0%/var/log/ceph/ceph-client.admin.37786.log: -- replaced with /var/log/ceph/ceph-client.admin.89542.log.gz 2026-03-24T12:09:12.878 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34832.log 2026-03-24T12:09:12.878 INFO:teuthology.orchestra.run.vm05.stderr: 26.4% -- 
replaced with /var/log/ceph/ceph-client.admin.37786.log.gz 2026-03-24T12:09:12.878 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39468.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39468.log.gz 2026-03-24T12:09:12.879 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81307.log 2026-03-24T12:09:12.879 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34832.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34832.log.gz
[... 2026-03-24T12:09:12.879 - 12:09:12.915: interleaved stderr from parallel `gzip -5 --verbose` runs on vm05 over the remaining /var/log/ceph/ceph-client.admin.*.log files; each file was replaced with its .log.gz counterpart (reported compression ratios ranged from 0.0% to 58.1%) ...]
2026-03-24T12:09:12.915 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64716.log: gzip -5
--verbose -- /var/log/ceph/ceph-client.admin.34132.log 2026-03-24T12:09:12.915 INFO:teuthology.orchestra.run.vm05.stderr: 54.6% -- replaced with /var/log/ceph/ceph-client.admin.64716.log.gz 2026-03-24T12:09:12.915 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84694.log 2026-03-24T12:09:12.915 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.88358.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88358.log.gz 2026-03-24T12:09:12.915 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30650.log 2026-03-24T12:09:12.916 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34132.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34132.log.gz 2026-03-24T12:09:12.916 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84694.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84694.log.gz 2026-03-24T12:09:12.916 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61431.log 2026-03-24T12:09:12.916 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61409.log 2026-03-24T12:09:12.916 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30650.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30650.log.gz 2026-03-24T12:09:12.916 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85252.log 2026-03-24T12:09:12.916 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61431.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61431.log.gz/var/log/ceph/ceph-client.admin.61409.log: 2026-03-24T12:09:12.916 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61409.log.gz 2026-03-24T12:09:12.916 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.66354.log 2026-03-24T12:09:12.917 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50778.log 2026-03-24T12:09:12.917 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.85252.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85252.log.gz 2026-03-24T12:09:12.917 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36603.log 2026-03-24T12:09:12.917 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66354.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66354.log.gz 2026-03-24T12:09:12.917 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50778.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50778.log.gz 2026-03-24T12:09:12.917 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46210.log 2026-03-24T12:09:12.917 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60227.log 2026-03-24T12:09:12.918 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36603.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36603.log.gz 2026-03-24T12:09:12.918 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69071.log 2026-03-24T12:09:12.918 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46210.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46210.log.gz 2026-03-24T12:09:12.918 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.60227.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60227.log.gz 2026-03-24T12:09:12.918 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57691.log 2026-03-24T12:09:12.918 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.45221.log 2026-03-24T12:09:12.918 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.69071.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69071.log.gz 2026-03-24T12:09:12.918 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89800.log 2026-03-24T12:09:12.919 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.57691.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57691.log.gz 2026-03-24T12:09:12.919 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45221.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45221.log.gz 2026-03-24T12:09:12.919 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91068.log 2026-03-24T12:09:12.919 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36739.log 2026-03-24T12:09:12.919 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.89800.log: 0.0%gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73284.log 2026-03-24T12:09:12.919 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.91068.log: -- replaced with /var/log/ceph/ceph-client.admin.89800.log.gz 2026-03-24T12:09:12.919 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91068.log.gz 2026-03-24T12:09:12.919 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36739.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36739.log.gz 2026-03-24T12:09:12.920 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46298.log 2026-03-24T12:09:12.920 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71588.log 2026-03-24T12:09:12.920 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.73284.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.73284.log.gz 2026-03-24T12:09:12.920 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87050.log 2026-03-24T12:09:12.920 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46298.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46298.log.gz 2026-03-24T12:09:12.920 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.71588.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71588.log.gz 2026-03-24T12:09:12.920 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35132.log 2026-03-24T12:09:12.921 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90553.log 2026-03-24T12:09:12.921 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.87050.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87050.log.gz 2026-03-24T12:09:12.921 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60807.log 2026-03-24T12:09:12.921 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35132.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35132.log.gz/var/log/ceph/ceph-client.admin.90553.log: 2026-03-24T12:09:12.921 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90553.log.gz 2026-03-24T12:09:12.921 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61721.log 2026-03-24T12:09:12.921 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60162.log 2026-03-24T12:09:12.922 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.60807.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60807.log.gz 2026-03-24T12:09:12.922 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.41061.log 2026-03-24T12:09:12.922 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61721.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61721.log.gz 2026-03-24T12:09:12.922 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.60162.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60162.log.gz 2026-03-24T12:09:12.922 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90015.log 2026-03-24T12:09:12.922 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41341.log 2026-03-24T12:09:12.922 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41061.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41061.log.gz 2026-03-24T12:09:12.922 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44392.log 2026-03-24T12:09:12.923 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90015.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90015.log.gz/var/log/ceph/ceph-client.admin.41341.log: 2026-03-24T12:09:12.923 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41341.log.gz 2026-03-24T12:09:12.923 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33672.log 2026-03-24T12:09:12.923 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58692.log 2026-03-24T12:09:12.923 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44392.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36450.log 2026-03-24T12:09:12.923 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.33672.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33672.log.gz 26.2% -- replaced with /var/log/ceph/ceph-client.admin.44392.log.gz 
2026-03-24T12:09:12.923 INFO:teuthology.orchestra.run.vm05.stderr: 2026-03-24T12:09:12.923 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.58692.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58692.log.gz 2026-03-24T12:09:12.923 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40562.log 2026-03-24T12:09:12.924 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42627.log 2026-03-24T12:09:12.924 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36450.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36450.log.gz 2026-03-24T12:09:12.924 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29016.log 2026-03-24T12:09:12.924 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40562.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40562.log.gz/var/log/ceph/ceph-client.admin.42627.log: 2026-03-24T12:09:12.924 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42627.log.gz 2026-03-24T12:09:12.924 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50972.log 2026-03-24T12:09:12.924 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86491.log 2026-03-24T12:09:12.925 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.29016.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29016.log.gz 2026-03-24T12:09:12.925 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35546.log 2026-03-24T12:09:12.925 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50972.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50972.log.gz 2026-03-24T12:09:12.925 
INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.86491.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86491.log.gz 2026-03-24T12:09:12.925 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84930.log 2026-03-24T12:09:12.925 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68378.log 2026-03-24T12:09:12.925 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35546.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35546.log.gz 2026-03-24T12:09:12.925 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56182.log 2026-03-24T12:09:12.926 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84930.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84930.log.gz 2026-03-24T12:09:12.926 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68378.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68378.log.gz 2026-03-24T12:09:12.926 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91288.log 2026-03-24T12:09:12.926 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45804.log 2026-03-24T12:09:12.926 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.56182.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56182.log.gz 2026-03-24T12:09:12.926 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32580.log 2026-03-24T12:09:12.926 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.91288.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91288.log.gz 2026-03-24T12:09:12.927 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45804.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45804.log.gz 
2026-03-24T12:09:12.927 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26082.log 2026-03-24T12:09:12.927 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84866.log 2026-03-24T12:09:12.927 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32580.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78307.log 2026-03-24T12:09:12.927 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.26082.log: 0.0% 1.2% -- replaced with /var/log/ceph/ceph-client.admin.26082.log.gz -- replaced with /var/log/ceph/ceph-client.admin.32580.log.gz 2026-03-24T12:09:12.927 INFO:teuthology.orchestra.run.vm05.stderr: 2026-03-24T12:09:12.927 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84866.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84866.log.gz 2026-03-24T12:09:12.928 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81285.log 2026-03-24T12:09:12.928 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73413.log 2026-03-24T12:09:12.928 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78307.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78307.log.gz 2026-03-24T12:09:12.928 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27007.log 2026-03-24T12:09:12.928 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.81285.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81285.log.gz 2026-03-24T12:09:12.928 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.73413.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73413.log.gz 2026-03-24T12:09:12.928 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63025.log 2026-03-24T12:09:12.929 
INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36300.log 2026-03-24T12:09:12.929 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27007.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27007.log.gz 2026-03-24T12:09:12.929 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55696.log 2026-03-24T12:09:12.929 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63025.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63025.log.gz/var/log/ceph/ceph-client.admin.36300.log: 2026-03-24T12:09:12.929 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36300.log.gz 2026-03-24T12:09:12.929 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71309.log 2026-03-24T12:09:12.929 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56775.log 2026-03-24T12:09:12.929 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55696.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55696.log.gz 2026-03-24T12:09:12.930 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32393.log 2026-03-24T12:09:12.930 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.71309.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71309.log.gz/var/log/ceph/ceph-client.admin.56775.log: 2026-03-24T12:09:12.930 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56775.log.gz 2026-03-24T12:09:12.930 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45028.log 2026-03-24T12:09:12.930 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54297.log 2026-03-24T12:09:12.930 
INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32393.log: /var/log/ceph/ceph-client.admin.45028.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45028.log.gz 2026-03-24T12:09:12.930 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31063.log 2026-03-24T12:09:12.930 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32393.log.gz 2026-03-24T12:09:12.931 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54297.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49609.log 2026-03-24T12:09:12.931 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54297.log.gz 2026-03-24T12:09:12.931 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.31063.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31063.log.gz 2026-03-24T12:09:12.931 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63992.log 2026-03-24T12:09:12.931 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59282.log 2026-03-24T12:09:12.931 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.49609.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49609.log.gz 2026-03-24T12:09:12.931 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73818.log 2026-03-24T12:09:12.932 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63992.log: /var/log/ceph/ceph-client.admin.59282.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59282.log.gz 2026-03-24T12:09:12.932 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32853.log 2026-03-24T12:09:12.932 INFO:teuthology.orchestra.run.vm05.stderr: 55.0% -- replaced with /var/log/ceph/ceph-client.admin.63992.log.gz 
2026-03-24T12:09:12.932 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35052.log 2026-03-24T12:09:12.932 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.73818.log: /var/log/ceph/ceph-client.admin.32853.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73818.log.gz 2026-03-24T12:09:12.932 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79252.log 2026-03-24T12:09:12.932 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32853.log.gz 2026-03-24T12:09:12.932 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35052.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35052.log.gz 2026-03-24T12:09:12.932 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33752.log 2026-03-24T12:09:12.933 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.79252.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72368.log 2026-03-24T12:09:12.933 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79252.log.gz 2026-03-24T12:09:12.933 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82288.log 2026-03-24T12:09:12.933 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.33752.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89607.log 2026-03-24T12:09:12.933 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72368.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33752.log.gz 2026-03-24T12:09:12.933 INFO:teuthology.orchestra.run.vm05.stderr: 11.1%/var/log/ceph/ceph-client.admin.82288.log: -- replaced with /var/log/ceph/ceph-client.admin.72368.log.gz 2026-03-24T12:09:12.934 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.82288.log.gz 2026-03-24T12:09:12.934 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89865.log 2026-03-24T12:09:12.934 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27857.log 2026-03-24T12:09:12.934 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.89607.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89607.log.gz 2026-03-24T12:09:12.934 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42067.log 2026-03-24T12:09:12.934 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.89865.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89865.log.gz 2026-03-24T12:09:12.934 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27857.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27857.log.gz 2026-03-24T12:09:12.934 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32273.log 2026-03-24T12:09:12.934 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27546.log 2026-03-24T12:09:12.935 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.42067.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42067.log.gz 2026-03-24T12:09:12.935 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32000.log 2026-03-24T12:09:12.935 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32273.log: /var/log/ceph/ceph-client.admin.27546.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27546.log.gz 2026-03-24T12:09:12.935 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42505.log 2026-03-24T12:09:12.935 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with 
/var/log/ceph/ceph-client.admin.32273.log.gz 2026-03-24T12:09:12.935 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32000.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32000.log.gz 2026-03-24T12:09:12.936 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74393.log 2026-03-24T12:09:12.936 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39296.log 2026-03-24T12:09:12.936 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.42505.log: /var/log/ceph/ceph-client.admin.74393.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48379.log 2026-03-24T12:09:12.936 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42505.log.gz 2026-03-24T12:09:12.936 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74393.log.gz 2026-03-24T12:09:12.936 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39296.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39296.log.gz 2026-03-24T12:09:12.937 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28051.log 2026-03-24T12:09:12.937 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27338.log 2026-03-24T12:09:12.937 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.48379.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36164.log 2026-03-24T12:09:12.937 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28051.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48379.log.gz 2026-03-24T12:09:12.937 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28051.log.gz 2026-03-24T12:09:12.937 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27338.log: 1.2% -- 
replaced with /var/log/ceph/ceph-client.admin.27338.log.gz 2026-03-24T12:09:12.937 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40184.log 2026-03-24T12:09:12.938 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85951.log 2026-03-24T12:09:12.938 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36164.log: /var/log/ceph/ceph-client.admin.40184.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76602.log 2026-03-24T12:09:12.938 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36164.log.gz 2026-03-24T12:09:12.938 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.40184.log.gz 2026-03-24T12:09:12.938 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.85951.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85951.log.gz 2026-03-24T12:09:12.938 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28029.log 2026-03-24T12:09:12.938 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31915.log 2026-03-24T12:09:12.939 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.76602.log: /var/log/ceph/ceph-client.admin.28029.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62925.log 2026-03-24T12:09:12.939 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76602.log.gz 2026-03-24T12:09:12.939 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.28029.log.gz 2026-03-24T12:09:12.939 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.31915.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69534.log 2026-03-24T12:09:12.939 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with 
/var/log/ceph/ceph-client.admin.31915.log.gz 2026-03-24T12:09:12.939 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67002.log 2026-03-24T12:09:12.939 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62925.log: /var/log/ceph/ceph-client.admin.69534.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71524.log 2026-03-24T12:09:12.940 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62925.log.gz 2026-03-24T12:09:12.940 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.69534.log.gz 2026-03-24T12:09:12.940 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67002.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67002.log.gz 2026-03-24T12:09:12.940 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81221.log 2026-03-24T12:09:12.940 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61782.log 2026-03-24T12:09:12.940 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.71524.log: /var/log/ceph/ceph-client.admin.81221.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42425.log 2026-03-24T12:09:12.940 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71524.log.gz 2026-03-24T12:09:12.940 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81221.log.gz 2026-03-24T12:09:12.941 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61782.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61782.log.gz 2026-03-24T12:09:12.941 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49178.log 2026-03-24T12:09:12.941 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.88150.log 2026-03-24T12:09:12.941 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.42425.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48234.log 2026-03-24T12:09:12.941 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.49178.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49178.log.gz 2026-03-24T12:09:12.941 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.88150.log: 59.4% -- replaced with /var/log/ceph/ceph-client.admin.42425.log.gz 2026-03-24T12:09:12.941 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88150.log.gz 2026-03-24T12:09:12.942 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49200.log 2026-03-24T12:09:12.942 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80767.log 2026-03-24T12:09:12.942 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.48234.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48234.log.gz 2026-03-24T12:09:12.942 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.49200.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83370.log 2026-03-24T12:09:12.942 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49200.log.gz 2026-03-24T12:09:12.942 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80767.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80767.log.gz 2026-03-24T12:09:12.942 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87115.log 2026-03-24T12:09:12.943 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83370.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51315.log 2026-03-24T12:09:12.943 INFO:teuthology.orchestra.run.vm05.stderr: 72.5% -- 
replaced with /var/log/ceph/ceph-client.admin.83370.log.gz 2026-03-24T12:09:12.943 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.87115.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77963.log 2026-03-24T12:09:12.943 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87115.log.gz 2026-03-24T12:09:12.943 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.51315.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87849.log 2026-03-24T12:09:12.943 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.51315.log.gz 2026-03-24T12:09:12.943 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27771.log 2026-03-24T12:09:12.943 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77963.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77963.log.gz 2026-03-24T12:09:12.944 INFO:teuthology.orchestra.run.vm05.stderr:gzip/var/log/ceph/ceph-client.admin.87849.log: -5 --verbose -- /var/log/ceph/ceph-client.admin.80358.log 2026-03-24T12:09:12.944 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87849.log.gz 2026-03-24T12:09:12.944 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27771.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27771.log.gz 2026-03-24T12:09:12.944 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38108.log 2026-03-24T12:09:12.944 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48088.log 2026-03-24T12:09:12.944 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80358.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80358.log.gz 2026-03-24T12:09:12.944 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.62791.log 2026-03-24T12:09:12.945 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.38108.log: /var/log/ceph/ceph-client.admin.48088.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.38108.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48088.log.gz 2026-03-24T12:09:12.945 INFO:teuthology.orchestra.run.vm05.stderr: 2026-03-24T12:09:12.945 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46975.log 2026-03-24T12:09:12.945 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66173.log 2026-03-24T12:09:12.945 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62791.log: /var/log/ceph/ceph-client.admin.46975.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62791.log.gz 2026-03-24T12:09:12.945 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46975.log.gz 2026-03-24T12:09:12.945 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39554.log 2026-03-24T12:09:12.945 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66173.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51726.log 2026-03-24T12:09:12.945 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66173.log.gz 2026-03-24T12:09:12.945 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39554.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75134.log 2026-03-24T12:09:12.946 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39554.log.gz 2026-03-24T12:09:12.946 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33552.log 2026-03-24T12:09:12.946 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.51726.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.51726.log.gz 2026-03-24T12:09:12.946 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43631.log 2026-03-24T12:09:12.946 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75134.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75134.log.gz 2026-03-24T12:09:12.946 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.33552.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54985.log 2026-03-24T12:09:12.946 INFO:teuthology.orchestra.run.vm05.stderr: 27.2% -- replaced with /var/log/ceph/ceph-client.admin.33552.log.gz 2026-03-24T12:09:12.947 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54985.log: 0.0%gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39532.log 2026-03-24T12:09:12.947 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.54985.log.gz 2026-03-24T12:09:12.947 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59340.log 2026-03-24T12:09:12.947 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43631.log: /var/log/ceph/ceph-client.admin.39532.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39532.log.gz 2026-03-24T12:09:12.947 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90854.log 2026-03-24T12:09:12.947 INFO:teuthology.orchestra.run.vm05.stderr: 31.5% -- replaced with /var/log/ceph/ceph-client.admin.43631.log.gz 2026-03-24T12:09:12.948 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59340.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58434.log 2026-03-24T12:09:12.948 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59340.log.gz 2026-03-24T12:09:12.948 
INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90854.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90854.log.gz 2026-03-24T12:09:12.948 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26146.log 2026-03-24T12:09:12.948 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75378.log 2026-03-24T12:09:12.948 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.58434.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58434.log.gz 2026-03-24T12:09:12.948 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30111.log 2026-03-24T12:09:12.948 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.26146.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26146.log.gz 2026-03-24T12:09:12.948 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75378.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75378.log.gz 2026-03-24T12:09:12.949 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68604.log 2026-03-24T12:09:12.949 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30424.log 2026-03-24T12:09:12.949 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30111.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30111.log.gz 2026-03-24T12:09:12.949 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50500.log 2026-03-24T12:09:12.949 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68604.log: /var/log/ceph/ceph-client.admin.30424.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30424.log.gz 2026-03-24T12:09:12.949 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54196.log 
2026-03-24T12:09:12.949 INFO:teuthology.orchestra.run.vm05.stderr: 26.8% -- replaced with /var/log/ceph/ceph-client.admin.68604.log.gz 2026-03-24T12:09:12.949 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64970.log 2026-03-24T12:09:12.950 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50500.log: /var/log/ceph/ceph-client.admin.54196.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50500.log.gz 2026-03-24T12:09:12.950 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54196.log.gz 2026-03-24T12:09:12.950 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84522.log 2026-03-24T12:09:12.950 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64970.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61877.log 2026-03-24T12:09:12.950 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64970.log.gz 2026-03-24T12:09:12.950 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84522.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30934.log 2026-03-24T12:09:12.950 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84522.log.gz 2026-03-24T12:09:12.951 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61877.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61877.log.gz 2026-03-24T12:09:12.951 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75631.log 2026-03-24T12:09:12.951 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49523.log 2026-03-24T12:09:12.951 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30934.log: /var/log/ceph/ceph-client.admin.75631.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.30934.log.gz 2026-03-24T12:09:12.951 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83979.log 2026-03-24T12:09:12.952 INFO:teuthology.orchestra.run.vm05.stderr: 29.3% -- replaced with /var/log/ceph/ceph-client.admin.75631.log.gz 2026-03-24T12:09:12.952 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.49523.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69921.log 2026-03-24T12:09:12.952 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49523.log.gz 2026-03-24T12:09:12.952 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37028.log 2026-03-24T12:09:12.952 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83979.log: /var/log/ceph/ceph-client.admin.69921.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69921.log.gz 2026-03-24T12:09:12.952 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83979.log.gz 2026-03-24T12:09:12.952 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36722.log 2026-03-24T12:09:12.953 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.37028.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27396.log 2026-03-24T12:09:12.953 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37028.log.gz 2026-03-24T12:09:12.953 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36722.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40541.log 2026-03-24T12:09:12.953 INFO:teuthology.orchestra.run.vm05.stderr: 26.9% -- replaced with /var/log/ceph/ceph-client.admin.36722.log.gz 2026-03-24T12:09:12.953 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30595.log 2026-03-24T12:09:12.953 
INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27396.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61704.log 2026-03-24T12:09:12.953 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40541.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27396.log.gz 2026-03-24T12:09:12.954 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30595.log: 67.1% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30595.log.gz 2026-03-24T12:09:12.954 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.40541.log.gz 2026-03-24T12:09:12.954 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26512.log 2026-03-24T12:09:12.954 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36535.log 2026-03-24T12:09:12.954 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61704.log: /var/log/ceph/ceph-client.admin.26512.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39984.log 2026-03-24T12:09:12.954 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61704.log.gz 2026-03-24T12:09:12.954 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.26512.log.gz 2026-03-24T12:09:12.955 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62571.log 2026-03-24T12:09:12.955 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36535.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36535.log.gz 2026-03-24T12:09:12.955 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88338.log 2026-03-24T12:09:12.955 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39984.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39984.log.gz 
2026-03-24T12:09:12.955 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78221.log 2026-03-24T12:09:12.955 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62571.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62571.log.gz 2026-03-24T12:09:12.955 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.88338.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86166.log 2026-03-24T12:09:12.955 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88338.log.gz 2026-03-24T12:09:12.956 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78221.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78221.log.gz 2026-03-24T12:09:12.956 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43912.log 2026-03-24T12:09:12.956 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83225.log 2026-03-24T12:09:12.956 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.86166.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86166.log.gz 2026-03-24T12:09:12.956 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50228.log 2026-03-24T12:09:12.957 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43912.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43912.log.gz 2026-03-24T12:09:12.957 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83225.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83225.log.gz 2026-03-24T12:09:12.957 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87260.log 2026-03-24T12:09:12.957 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47297.log 2026-03-24T12:09:12.957 
INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50228.log: /var/log/ceph/ceph-client.admin.87260.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50228.log.gz 2026-03-24T12:09:12.957 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87260.log.gz 2026-03-24T12:09:12.957 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72940.log 2026-03-24T12:09:12.957 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47882.log 2026-03-24T12:09:12.958 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.47297.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47297.log.gz 2026-03-24T12:09:12.958 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72940.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.72940.log.gz -5 2026-03-24T12:09:12.958 INFO:teuthology.orchestra.run.vm05.stderr: --verbose -- /var/log/ceph/ceph-client.admin.46543.log 2026-03-24T12:09:12.958 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90897.log 2026-03-24T12:09:12.958 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.47882.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80251.log 2026-03-24T12:09:12.958 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46543.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47882.log.gz 2026-03-24T12:09:12.958 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46543.log.gz/var/log/ceph/ceph-client.admin.90897.log: 2026-03-24T12:09:12.958 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90897.log.gz 2026-03-24T12:09:12.958 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42246.log 
2026-03-24T12:09:12.959 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65593.log 2026-03-24T12:09:12.959 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80251.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80251.log.gz 2026-03-24T12:09:12.959 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80552.log 2026-03-24T12:09:12.959 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.42246.log: /var/log/ceph/ceph-client.admin.65593.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65593.log.gz 2026-03-24T12:09:12.959 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70027.log 2026-03-24T12:09:12.959 INFO:teuthology.orchestra.run.vm05.stderr: 53.0% -- replaced with /var/log/ceph/ceph-client.admin.42246.log.gz 2026-03-24T12:09:12.959 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49846.log 2026-03-24T12:09:12.960 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80552.log: /var/log/ceph/ceph-client.admin.70027.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80552.log.gz 2026-03-24T12:09:12.960 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70027.log.gz 2026-03-24T12:09:12.960 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82307.log 2026-03-24T12:09:12.960 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.49846.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61194.log 2026-03-24T12:09:12.960 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49846.log.gz 2026-03-24T12:09:12.960 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82307.log: gzip -5 --verbose 0.0% -- 
/var/log/ceph/ceph-client.admin.60786.log -- replaced with /var/log/ceph/ceph-client.admin.82307.log.gz 2026-03-24T12:09:12.960 INFO:teuthology.orchestra.run.vm05.stderr: 2026-03-24T12:09:12.960 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61130.log 2026-03-24T12:09:12.961 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61194.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61194.log.gz 2026-03-24T12:09:12.961 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54632.log 2026-03-24T12:09:12.961 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.60786.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60786.log.gz 2026-03-24T12:09:12.961 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61130.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61130.log.gz 2026-03-24T12:09:12.961 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57286.log 2026-03-24T12:09:12.961 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86080.log 2026-03-24T12:09:12.961 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54632.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58773.log 2026-03-24T12:09:12.961 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.57286.log: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.54632.log.gz 2026-03-24T12:09:12.962 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57286.log.gz 2026-03-24T12:09:12.962 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.86080.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86080.log.gz 2026-03-24T12:09:12.962 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.72805.log 2026-03-24T12:09:12.962 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37387.log 2026-03-24T12:09:12.962 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.58773.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58773.log.gz 2026-03-24T12:09:12.962 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44034.log 2026-03-24T12:09:12.963 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72805.log: /var/log/ceph/ceph-client.admin.37387.log: 56.2% 26.7% -- replaced with /var/log/ceph/ceph-client.admin.37387.log.gz 2026-03-24T12:09:12.963 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.72805.log.gz 2026-03-24T12:09:12.963 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89429.log 2026-03-24T12:09:12.963 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75563.log 2026-03-24T12:09:12.963 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44034.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44034.log.gz 2026-03-24T12:09:12.963 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46193.log 2026-03-24T12:09:12.964 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.89429.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89429.log.gz/var/log/ceph/ceph-client.admin.75563.log: 2026-03-24T12:09:12.964 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75563.log.gz 2026-03-24T12:09:12.964 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46953.log 2026-03-24T12:09:12.964 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.66461.log 2026-03-24T12:09:12.964 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46193.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46193.log.gz 2026-03-24T12:09:12.964 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42026.log 2026-03-24T12:09:12.964 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46953.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46953.log.gz/var/log/ceph/ceph-client.admin.66461.log: 2026-03-24T12:09:12.964 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66461.log.gz 2026-03-24T12:09:12.965 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30025.log 2026-03-24T12:09:12.965 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37883.log 2026-03-24T12:09:12.965 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.42026.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28930.log 2026-03-24T12:09:12.965 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30025.log: 17.8% -- replaced with /var/log/ceph/ceph-client.admin.42026.log.gz 2026-03-24T12:09:12.965 INFO:teuthology.orchestra.run.vm05.stderr: 0.0%/var/log/ceph/ceph-client.admin.37883.log: -- replaced with /var/log/ceph/ceph-client.admin.30025.log.gz 2026-03-24T12:09:12.965 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37883.log.gz 2026-03-24T12:09:12.965 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54835.log 2026-03-24T12:09:12.966 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57227.log 2026-03-24T12:09:12.966 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28930.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.28930.log.gz 2026-03-24T12:09:12.966 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54835.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80062.log 2026-03-24T12:09:12.966 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54835.log.gz 2026-03-24T12:09:12.966 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.57227.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57227.log.gz 2026-03-24T12:09:12.966 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45114.log 2026-03-24T12:09:12.966 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29445.log 2026-03-24T12:09:12.967 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80062.log: /var/log/ceph/ceph-client.admin.45114.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40241.log 2026-03-24T12:09:12.967 INFO:teuthology.orchestra.run.vm05.stderr: 56.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80062.log.gz 2026-03-24T12:09:12.967 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.45114.log.gz 2026-03-24T12:09:12.967 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.29445.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29445.log.gz 2026-03-24T12:09:12.967 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43412.log 2026-03-24T12:09:12.967 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68810.log 2026-03-24T12:09:12.967 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40241.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40241.log.gz 2026-03-24T12:09:12.968 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.26555.log
2026-03-24T12:09:12.968 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43412.log: /var/log/ceph/ceph-client.admin.68810.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68810.log.gz
2026-03-24T12:09:12.968 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30561.log
2026-03-24T12:09:12.968 INFO:teuthology.orchestra.run.vm05.stderr: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.43412.log.gz
2026-03-24T12:09:12.968 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76924.log
2026-03-24T12:09:12.968 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.26555.log: /var/log/ceph/ceph-client.admin.30561.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26555.log.gz
2026-03-24T12:09:12.968 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30561.log.gz
2026-03-24T12:09:12.968 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph.log
2026-03-24T12:09:12.968 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.76924.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75699.log
2026-03-24T12:09:12.969 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76924.log.gz
2026-03-24T12:09:12.969 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36926.log
2026-03-24T12:09:12.969 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83657.log
2026-03-24T12:09:12.969 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75699.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75699.log.gz
2026-03-24T12:09:12.969 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36926.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36926.log.gz
2026-03-24T12:09:12.969 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45264.log
2026-03-24T12:09:12.970 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30407.log
2026-03-24T12:09:12.970 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83657.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83657.log.gz
2026-03-24T12:09:12.970 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45264.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45264.log.gz
2026-03-24T12:09:12.970 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39597.log
2026-03-24T12:09:12.970 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46704.log
2026-03-24T12:09:12.971 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39597.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39597.log.gz/var/log/ceph/ceph-client.admin.30407.log:
2026-03-24T12:09:12.971 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.30407.log.gz
2026-03-24T12:09:12.971 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27943.log
2026-03-24T12:09:12.971 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38938.log
2026-03-24T12:09:12.971 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46704.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46704.log.gz
2026-03-24T12:09:12.972 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27943.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27943.log.gz
2026-03-24T12:09:12.972 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82972.log
2026-03-24T12:09:12.972 INFO:teuthology.orchestra.run.vm05.stderr: 90.0% -- replaced with /var/log/ceph/ceph.log.gz
2026-03-24T12:09:12.972 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75734.log
2026-03-24T12:09:12.972 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.38938.log: /var/log/ceph/ceph-client.admin.82972.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53266.log
2026-03-24T12:09:12.972 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82972.log.gz
2026-03-24T12:09:12.973 INFO:teuthology.orchestra.run.vm05.stderr: 25.7% -- replaced with /var/log/ceph/ceph-client.admin.38938.log.gz
2026-03-24T12:09:12.973 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75734.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75734.log.gz
2026-03-24T12:09:12.973 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54697.log
2026-03-24T12:09:12.973 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59990.log
2026-03-24T12:09:12.973 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.53266.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53266.log.gz
2026-03-24T12:09:12.973 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31745.log
2026-03-24T12:09:12.973 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54697.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54697.log.gz/var/log/ceph/ceph-client.admin.59990.log:
2026-03-24T12:09:12.973 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59990.log.gz
2026-03-24T12:09:12.973 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76752.log
2026-03-24T12:09:12.974 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55567.log
2026-03-24T12:09:12.974 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.31745.log: /var/log/ceph/ceph-client.admin.76752.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37513.log
2026-03-24T12:09:12.974 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76752.log.gz
2026-03-24T12:09:12.974 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.31745.log.gz
2026-03-24T12:09:12.974 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55567.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55567.log.gz
2026-03-24T12:09:12.974 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77855.log
2026-03-24T12:09:12.975 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.37513.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.37513.log.gz
2026-03-24T12:09:12.975 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84844.log
2026-03-24T12:09:12.975 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77855.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29380.log
2026-03-24T12:09:12.975 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77855.log.gz
2026-03-24T12:09:12.976 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84844.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56560.log
2026-03-24T12:09:12.976 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84844.log.gz
2026-03-24T12:09:12.976 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68416.log
2026-03-24T12:09:12.976 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.29380.log: /var/log/ceph/ceph-client.admin.56560.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59557.log
2026-03-24T12:09:12.976 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56560.log.gz
2026-03-24T12:09:12.976 INFO:teuthology.orchestra.run.vm05.stderr: 25.9% -- replaced with /var/log/ceph/ceph-client.admin.29380.log.gz
2026-03-24T12:09:12.976 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68416.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68416.log.gz
2026-03-24T12:09:12.976 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78890.log
2026-03-24T12:09:12.977 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74648.log
2026-03-24T12:09:12.977 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59557.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59557.log.gz
2026-03-24T12:09:12.977 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78890.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84022.log
2026-03-24T12:09:12.977 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78890.log.gz
2026-03-24T12:09:12.977 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74648.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36909.log
2026-03-24T12:09:12.977 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74648.log.gz
2026-03-24T12:09:12.977 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84022.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84022.log.gz
2026-03-24T12:09:12.978 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27168.log
2026-03-24T12:09:12.978 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61818.log
2026-03-24T12:09:12.978 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36909.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36909.log.gz
2026-03-24T12:09:12.978 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67819.log
2026-03-24T12:09:12.978 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27168.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27168.log.gz
2026-03-24T12:09:12.978 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61818.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47168.log
2026-03-24T12:09:12.979 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61818.log.gz
2026-03-24T12:09:12.979 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72983.log
2026-03-24T12:09:12.979 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67819.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67819.log.gz
2026-03-24T12:09:12.979 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.47168.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47168.log.gz
2026-03-24T12:09:12.979 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35292.log
2026-03-24T12:09:12.979 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61108.log
2026-03-24T12:09:12.980 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72983.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72983.log.gz
2026-03-24T12:09:12.980 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27490.log
2026-03-24T12:09:12.980 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35292.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35292.log.gz
2026-03-24T12:09:12.980 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61108.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36671.log
2026-03-24T12:09:12.980 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61108.log.gz
2026-03-24T12:09:12.980 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27490.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27490.log.gz
2026-03-24T12:09:12.980 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59107.log
2026-03-24T12:09:12.981 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51144.log
2026-03-24T12:09:12.981 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59107.log: /var/log/ceph/ceph-client.admin.36671.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59107.log.gz
2026-03-24T12:09:12.981 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36671.log.gz
2026-03-24T12:09:12.981 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-osd.0.log
2026-03-24T12:09:12.981 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.51144.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84500.log
2026-03-24T12:09:12.981 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51144.log.gz
2026-03-24T12:09:12.981 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-osd.0.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76384.log
2026-03-24T12:09:12.982 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64284.log
2026-03-24T12:09:12.982 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84500.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84500.log.gz
2026-03-24T12:09:12.982 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.76384.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76384.log.gz
2026-03-24T12:09:12.991 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80466.log
2026-03-24T12:09:12.991 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71330.log
2026-03-24T12:09:12.991 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64284.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64284.log.gz
2026-03-24T12:09:12.991 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80466.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80466.log.gz
2026-03-24T12:09:12.992 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44899.log
2026-03-24T12:09:12.992 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45976.log
2026-03-24T12:09:12.992 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.71330.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71330.log.gz
2026-03-24T12:09:12.992 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44899.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50929.log
2026-03-24T12:09:12.992 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44899.log.gz
2026-03-24T12:09:12.993 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45976.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45976.log.gz
2026-03-24T12:09:12.993 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59947.log
2026-03-24T12:09:12.993 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78732.log
2026-03-24T12:09:12.993 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50929.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50929.log.gz
2026-03-24T12:09:12.993 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59947.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46149.log
2026-03-24T12:09:12.994 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59947.log.gz
2026-03-24T12:09:12.994 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78732.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78732.log.gz
2026-03-24T12:09:12.994 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35685.log
2026-03-24T12:09:12.994 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67970.log
2026-03-24T12:09:12.994 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46149.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46149.log.gz
2026-03-24T12:09:12.995 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35685.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35685.log.gz
2026-03-24T12:09:12.995 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32785.log
2026-03-24T12:09:12.995 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60205.log
2026-03-24T12:09:12.995 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67970.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67970.log.gz
2026-03-24T12:09:12.995 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32785.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39403.log
2026-03-24T12:09:12.995 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32785.log.gz
2026-03-24T12:09:12.996 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.60205.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60205.log.gz
2026-03-24T12:09:12.996 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83144.log
2026-03-24T12:09:12.996 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44576.log
2026-03-24T12:09:12.996 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39403.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39403.log.gz
2026-03-24T12:09:12.996 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83144.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83144.log.gz
2026-03-24T12:09:12.997 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31898.log
2026-03-24T12:09:12.997 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62675.log
2026-03-24T12:09:12.997 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44576.log: /var/log/ceph/ceph-client.admin.31898.log: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.44576.log.gz
2026-03-24T12:09:12.997 INFO:teuthology.orchestra.run.vm05.stderr: 1.2%gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72640.log
2026-03-24T12:09:12.997 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.31898.log.gz
2026-03-24T12:09:12.998 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62675.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62675.log.gz
2026-03-24T12:09:12.998 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38864.log
2026-03-24T12:09:12.998 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37219.log
2026-03-24T12:09:12.998 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72640.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72640.log.gz
2026-03-24T12:09:12.998 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.38864.log: 52.9% -- replaced with /var/log/ceph/ceph-client.admin.38864.log.gz
2026-03-24T12:09:12.999 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34872.log
2026-03-24T12:09:12.999 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40499.log
2026-03-24T12:09:12.999 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.37219.log: /var/log/ceph/ceph-client.admin.34872.log: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.37219.log.gz
2026-03-24T12:09:12.999 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34872.log.gz
2026-03-24T12:09:12.999 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42878.log
2026-03-24T12:09:13.000 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25718.log
2026-03-24T12:09:13.000 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40499.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40499.log.gz
2026-03-24T12:09:13.000 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.42878.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42878.log.gz
2026-03-24T12:09:13.000 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37597.log
2026-03-24T12:09:13.000 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45243.log
2026-03-24T12:09:13.000 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.25718.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25718.log.gz
2026-03-24T12:09:13.001 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.37597.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34852.log
2026-03-24T12:09:13.001 INFO:teuthology.orchestra.run.vm05.stderr: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.37597.log.gz
2026-03-24T12:09:13.001 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45243.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45243.log.gz
2026-03-24T12:09:13.001 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77642.log
2026-03-24T12:09:13.001 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34852.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75244.log
2026-03-24T12:09:13.001 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34852.log.gz
2026-03-24T12:09:13.002 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77642.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77642.log.gz
2026-03-24T12:09:13.002 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63223.log
2026-03-24T12:09:13.002 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32478.log
2026-03-24T12:09:13.002 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75244.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75244.log.gz
2026-03-24T12:09:13.002 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63223.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63223.log.gz
2026-03-24T12:09:13.003 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33281.log
2026-03-24T12:09:13.003 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48274.log
2026-03-24T12:09:13.003 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32478.log: /var/log/ceph/ceph-client.admin.33281.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32478.log.gz
2026-03-24T12:09:13.003 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.33281.log.gz
2026-03-24T12:09:13.003 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46803.log
2026-03-24T12:09:13.004 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90510.log
2026-03-24T12:09:13.004 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46803.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46803.log.gz/var/log/ceph/ceph-client.admin.48274.log:
2026-03-24T12:09:13.004 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48274.log.gz
2026-03-24T12:09:13.004 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90037.log
2026-03-24T12:09:13.005 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77661.log
2026-03-24T12:09:13.005 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90510.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90510.log.gz
2026-03-24T12:09:13.005 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90037.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90037.log.gz
2026-03-24T12:09:13.018 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78555.log
2026-03-24T12:09:13.019 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77661.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77661.log.gz
2026-03-24T12:09:13.019 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86295.log
2026-03-24T12:09:13.019 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78555.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68477.log
2026-03-24T12:09:13.019 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78555.log.gz
2026-03-24T12:09:13.019 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.86295.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86295.log.gz
2026-03-24T12:09:13.020 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79644.log
2026-03-24T12:09:13.020 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35478.log
2026-03-24T12:09:13.020 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68477.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68477.log.gz
2026-03-24T12:09:13.021 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.79644.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.79644.log.gz -5 --verbose -- /var/log/ceph/ceph-client.admin.84801.log
2026-03-24T12:09:13.021 INFO:teuthology.orchestra.run.vm05.stderr:
2026-03-24T12:09:13.021 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35478.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35478.log.gz
2026-03-24T12:09:13.021 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79625.log
2026-03-24T12:09:13.022 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31214.log
2026-03-24T12:09:13.022 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84801.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84801.log.gz
2026-03-24T12:09:13.022 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.79625.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79625.log.gz
2026-03-24T12:09:13.022 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62945.log
2026-03-24T12:09:13.022 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83534.log
2026-03-24T12:09:13.023 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.31214.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31214.log.gz
2026-03-24T12:09:13.023 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62945.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62945.log.gz
2026-03-24T12:09:13.023 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58413.log
2026-03-24T12:09:13.023 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27965.log
2026-03-24T12:09:13.023 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83534.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83534.log.gz
2026-03-24T12:09:13.024 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.58413.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58413.log.gz
2026-03-24T12:09:13.024 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63263.log
2026-03-24T12:09:13.024 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29123.log
2026-03-24T12:09:13.024 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27965.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27965.log.gz
2026-03-24T12:09:13.025 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63263.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74921.log
2026-03-24T12:09:13.025 INFO:teuthology.orchestra.run.vm05.stderr: 57.7% -- replaced with /var/log/ceph/ceph-client.admin.63263.log.gz
2026-03-24T12:09:13.025 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.29123.log: 27.2% -- replaced with /var/log/ceph/ceph-client.admin.29123.log.gz
2026-03-24T12:09:13.025 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79769.log
2026-03-24T12:09:13.026 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39106.log
2026-03-24T12:09:13.026 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74921.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74921.log.gz
2026-03-24T12:09:13.026 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.79769.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79769.log.gz
2026-03-24T12:09:13.026 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71889.log
2026-03-24T12:09:13.027 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39106.log: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.39106.log.gz
2026-03-24T12:09:13.027 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37408.log
2026-03-24T12:09:13.027 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35192.log
2026-03-24T12:09:13.027 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.71889.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71889.log.gz
2026-03-24T12:09:13.028 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.37408.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63109.log
2026-03-24T12:09:13.028 INFO:teuthology.orchestra.run.vm05.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.37408.log.gz
2026-03-24T12:09:13.028 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35192.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35192.log.gz
2026-03-24T12:09:13.028 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87007.log
2026-03-24T12:09:13.029 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63109.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27262.log
2026-03-24T12:09:13.029 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63109.log.gz
2026-03-24T12:09:13.029 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.87007.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87007.log.gz
2026-03-24T12:09:13.029 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56035.log
2026-03-24T12:09:13.030 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27262.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.27262.log.gz
2026-03-24T12:09:13.030 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43251.log
2026-03-24T12:09:13.030 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45611.log
2026-03-24T12:09:13.030 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.56035.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56035.log.gz
2026-03-24T12:09:13.031 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43251.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43251.log.gz
2026-03-24T12:09:13.031 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50354.log
2026-03-24T12:09:13.031 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32017.log
2026-03-24T12:09:13.031 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45611.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45611.log.gz
2026-03-24T12:09:13.031 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50354.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50354.log.gz
2026-03-24T12:09:13.032 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47276.log
2026-03-24T12:09:13.032 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32017.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32017.log.gz
2026-03-24T12:09:13.032 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82346.log
2026-03-24T12:09:13.033 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53336.log
2026-03-24T12:09:13.033 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.47276.log: /var/log/ceph/ceph-client.admin.82346.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47276.log.gz
2026-03-24T12:09:13.033 INFO:teuthology.orchestra.run.vm05.stderr: 57.9% -- replaced with /var/log/ceph/ceph-client.admin.82346.log.gz
2026-03-24T12:09:13.033 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44856.log
2026-03-24T12:09:13.034 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63285.log
2026-03-24T12:09:13.034 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.53336.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53336.log.gz
2026-03-24T12:09:13.034 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44856.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44856.log.gz
2026-03-24T12:09:13.034 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40163.log
2026-03-24T12:09:13.035 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77522.log
2026-03-24T12:09:13.035 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63285.log: /var/log/ceph/ceph-client.admin.40163.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63285.log.gz
2026-03-24T12:09:13.035 INFO:teuthology.orchestra.run.vm05.stderr: 25.8% -- replaced with /var/log/ceph/ceph-client.admin.40163.log.gz
2026-03-24T12:09:13.035 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32802.log
2026-03-24T12:09:13.035 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61569.log
2026-03-24T12:09:13.035 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77522.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77522.log.gz
2026-03-24T12:09:13.036 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32802.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28223.log
2026-03-24T12:09:13.036 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32802.log.gz
2026-03-24T12:09:13.036 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61569.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61569.log.gz
2026-03-24T12:09:13.036 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47190.log
2026-03-24T12:09:13.036 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38696.log
2026-03-24T12:09:13.037 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28223.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28223.log.gz
2026-03-24T12:09:13.037 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.47190.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47190.log.gz
2026-03-24T12:09:13.037 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74339.log
2026-03-24T12:09:13.037 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32529.log
2026-03-24T12:09:13.038 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.38696.log: /var/log/ceph/ceph-client.admin.74339.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74339.log.gz
2026-03-24T12:09:13.038 INFO:teuthology.orchestra.run.vm05.stderr: 26.2% -- replaced with /var/log/ceph/ceph-client.admin.38696.log.gz
2026-03-24T12:09:13.038 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50563.log
2026-03-24T12:09:13.038 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67561.log
2026-03-24T12:09:13.038 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32529.log: /var/log/ceph/ceph-client.admin.50563.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50563.log.gz 1.2%
2026-03-24T12:09:13.038 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.32529.log.gz
2026-03-24T12:09:13.038 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32341.log
2026-03-24T12:09:13.039 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78006.log
2026-03-24T12:09:13.039 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67561.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67561.log.gz
2026-03-24T12:09:13.039 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32341.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74356.log
2026-03-24T12:09:13.039 INFO:teuthology.orchestra.run.vm05.stderr: 2.5% -- replaced with /var/log/ceph/ceph-client.admin.32341.log.gz
2026-03-24T12:09:13.039 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78006.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78006.log.gz
2026-03-24T12:09:13.040 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75492.log
2026-03-24T12:09:13.040 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64833.log
2026-03-24T12:09:13.040 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74356.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74356.log.gz
2026-03-24T12:09:13.040 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75492.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75492.log.gz
2026-03-24T12:09:13.040 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31813.log
2026-03-24T12:09:13.040 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61899.log
2026-03-24T12:09:13.041 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64833.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64833.log.gz
2026-03-24T12:09:13.041 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.31813.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90316.log
2026-03-24T12:09:13.041 INFO:teuthology.orchestra.run.vm05.stderr: 2.5% -- replaced with /var/log/ceph/ceph-client.admin.31813.log.gz
2026-03-24T12:09:13.041 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61899.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61899.log.gz
2026-03-24T12:09:13.041 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75474.log
2026-03-24T12:09:13.042 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose --
/var/log/ceph/ceph-client.admin.26017.log 2026-03-24T12:09:13.042 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90316.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90316.log.gz 2026-03-24T12:09:13.042 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75474.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75474.log.gz 2026-03-24T12:09:13.042 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42547.log 2026-03-24T12:09:13.042 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87634.log 2026-03-24T12:09:13.043 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.26017.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26017.log.gz 2026-03-24T12:09:13.043 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.42547.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42547.log.gz 2026-03-24T12:09:13.043 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78414.log 2026-03-24T12:09:13.043 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40459.log 2026-03-24T12:09:13.043 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.87634.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87634.log.gz 2026-03-24T12:09:13.043 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78414.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78414.log.gz 2026-03-24T12:09:13.044 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48026.log 2026-03-24T12:09:13.044 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71989.log 2026-03-24T12:09:13.044 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40459.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.40459.log.gz 2026-03-24T12:09:13.044 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.48026.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48026.log.gz 2026-03-24T12:09:13.044 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76212.log 2026-03-24T12:09:13.045 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57352.log 2026-03-24T12:09:13.045 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.71989.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71989.log.gz 2026-03-24T12:09:13.045 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.76212.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76212.log.gz 2026-03-24T12:09:13.045 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72079.log 2026-03-24T12:09:13.045 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87400.log 2026-03-24T12:09:13.046 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.57352.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57352.log.gz 2026-03-24T12:09:13.046 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72079.log: 85.6% -- replaced with /var/log/ceph/ceph-client.admin.72079.log.gz 2026-03-24T12:09:13.046 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32546.log 2026-03-24T12:09:13.046 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45761.log 2026-03-24T12:09:13.046 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.87400.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87400.log.gz 2026-03-24T12:09:13.047 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32546.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.71416.log 2026-03-24T12:09:13.047 INFO:teuthology.orchestra.run.vm05.stderr: 2.5% -- replaced with /var/log/ceph/ceph-client.admin.32546.log.gz 2026-03-24T12:09:13.047 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45761.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45761.log.gz 2026-03-24T12:09:13.047 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54419.log 2026-03-24T12:09:13.048 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81737.log 2026-03-24T12:09:13.048 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.71416.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71416.log.gz 2026-03-24T12:09:13.048 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54419.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54419.log.gz 2026-03-24T12:09:13.048 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54792.log 2026-03-24T12:09:13.048 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80509.log 2026-03-24T12:09:13.049 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.81737.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81737.log.gz 2026-03-24T12:09:13.049 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54792.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54792.log.gz 2026-03-24T12:09:13.049 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43693.log 2026-03-24T12:09:13.049 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37492.log 2026-03-24T12:09:13.049 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80509.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.80509.log.gz 2026-03-24T12:09:13.050 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43693.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74551.log 2026-03-24T12:09:13.050 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.43693.log.gz 2026-03-24T12:09:13.050 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.37492.log: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.37492.log.gz 2026-03-24T12:09:13.050 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55373.log 2026-03-24T12:09:13.051 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74551.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74551.log.gz 2026-03-24T12:09:13.051 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32631.log 2026-03-24T12:09:13.051 INFO:teuthology.orchestra.run.vm05.stderr: 92.1% -- replaced with /var/log/ceph/ceph-mgr.x.log.gz 2026-03-24T12:09:13.051 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55373.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82169.log 2026-03-24T12:09:13.051 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55373.log.gz 2026-03-24T12:09:13.051 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48707.log 2026-03-24T12:09:13.052 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32631.log: /var/log/ceph/ceph-client.admin.82169.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60377.log 2026-03-24T12:09:13.052 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82169.log.gz 2026-03-24T12:09:13.052 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with 
/var/log/ceph/ceph-client.admin.32631.log.gz 2026-03-24T12:09:13.052 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.48707.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48707.log.gz 2026-03-24T12:09:13.052 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32597.log 2026-03-24T12:09:13.052 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28244.log 2026-03-24T12:09:13.052 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.60377.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60377.log.gz 2026-03-24T12:09:13.053 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32597.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56625.log 2026-03-24T12:09:13.053 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32597.log.gz 2026-03-24T12:09:13.053 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28244.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28244.log.gz 2026-03-24T12:09:13.053 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59322.log 2026-03-24T12:09:13.053 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.56625.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73456.log 2026-03-24T12:09:13.053 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56625.log.gz 2026-03-24T12:09:13.054 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46846.log 2026-03-24T12:09:13.054 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59322.log: 10.7%gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61607.log 2026-03-24T12:09:13.054 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with 
/var/log/ceph/ceph-client.admin.59322.log.gz 2026-03-24T12:09:13.054 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.73456.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73456.log.gz 2026-03-24T12:09:13.054 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46846.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46846.log.gz 2026-03-24T12:09:13.055 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34752.log 2026-03-24T12:09:13.055 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29810.log 2026-03-24T12:09:13.055 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61607.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61607.log.gz 2026-03-24T12:09:13.055 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34752.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34752.log.gz 2026-03-24T12:09:13.055 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50907.log 2026-03-24T12:09:13.055 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.29810.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55114.log 2026-03-24T12:09:13.056 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29810.log.gz 2026-03-24T12:09:13.056 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30262.log 2026-03-24T12:09:13.056 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50907.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50907.log.gz 2026-03-24T12:09:13.056 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55114.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55114.log.gz 2026-03-24T12:09:13.056 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.33712.log 2026-03-24T12:09:13.056 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30262.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52600.log 2026-03-24T12:09:13.057 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30262.log.gz 2026-03-24T12:09:13.057 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.33712.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33712.log.gz 2026-03-24T12:09:13.057 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88856.log 2026-03-24T12:09:13.057 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50886.log 2026-03-24T12:09:13.057 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.52600.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52600.log.gz 2026-03-24T12:09:13.057 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.88856.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62534.log 2026-03-24T12:09:13.057 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88856.log.gz 2026-03-24T12:09:13.058 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50886.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60184.log 2026-03-24T12:09:13.058 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50886.log.gz 2026-03-24T12:09:13.058 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45847.log 2026-03-24T12:09:13.058 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62534.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62534.log.gz 2026-03-24T12:09:13.058 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.60184.log: 
gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64523.log 2026-03-24T12:09:13.058 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60184.log.gz 2026-03-24T12:09:13.058 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45847.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76492.log 2026-03-24T12:09:13.058 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45847.log.gz 2026-03-24T12:09:13.059 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52944.log 2026-03-24T12:09:13.059 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64523.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64523.log.gz 2026-03-24T12:09:13.059 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.76492.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62071.log 2026-03-24T12:09:13.059 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76492.log.gz 2026-03-24T12:09:13.059 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.52944.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34232.log 2026-03-24T12:09:13.059 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52944.log.gz 2026-03-24T12:09:13.060 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82133.log 2026-03-24T12:09:13.060 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62071.log: /var/log/ceph/ceph-client.admin.34232.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59765.log 2026-03-24T12:09:13.060 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34232.log.gz 2026-03-24T12:09:13.060 INFO:teuthology.orchestra.run.vm05.stderr: 52.1% -- replaced with 
/var/log/ceph/ceph-client.admin.62071.log.gz 2026-03-24T12:09:13.060 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82133.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82133.log.gz 2026-03-24T12:09:13.060 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77812.log 2026-03-24T12:09:13.060 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82236.log 2026-03-24T12:09:13.061 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59765.log: /var/log/ceph/ceph-client.admin.77812.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59765.log.gz 2026-03-24T12:09:13.061 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42647.log 2026-03-24T12:09:13.061 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77812.log.gz 2026-03-24T12:09:13.061 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82236.log: gzip -5 0.0% --verbose -- replaced with /var/log/ceph/ceph-client.admin.82236.log.gz -- 2026-03-24T12:09:13.061 INFO:teuthology.orchestra.run.vm05.stderr: /var/log/ceph/ceph-client.admin.76083.log 2026-03-24T12:09:13.061 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90596.log 2026-03-24T12:09:13.062 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.42647.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42647.log.gz 2026-03-24T12:09:13.062 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.76083.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50843.log 2026-03-24T12:09:13.062 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76083.log.gz 2026-03-24T12:09:13.062 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90596.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.27414.log 2026-03-24T12:09:13.062 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90596.log.gz 2026-03-24T12:09:13.062 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64111.log 2026-03-24T12:09:13.063 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27414.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27414.log.gz 2026-03-24T12:09:13.063 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50843.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50843.log.gz 2026-03-24T12:09:13.063 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64930.log 2026-03-24T12:09:13.063 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64111.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35889.log 2026-03-24T12:09:13.063 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64111.log.gz 2026-03-24T12:09:13.064 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64930.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62885.log 2026-03-24T12:09:13.064 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64930.log.gz 2026-03-24T12:09:13.064 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65171.log 2026-03-24T12:09:13.064 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35889.log: /var/log/ceph/ceph-client.admin.62885.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35889.log.gz 2026-03-24T12:09:13.064 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62885.log.gz 2026-03-24T12:09:13.064 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.56517.log 2026-03-24T12:09:13.065 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65171.log: 0.0%gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51706.log 2026-03-24T12:09:13.065 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /var/log/ceph/ceph-client.admin.65171.log.gz 2026-03-24T12:09:13.065 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.56517.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86252.log 2026-03-24T12:09:13.065 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56517.log.gz 2026-03-24T12:09:13.065 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59378.log 2026-03-24T12:09:13.065 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.51706.log: /var/log/ceph/ceph-client.admin.86252.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51706.log.gz 2026-03-24T12:09:13.066 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58133.log 2026-03-24T12:09:13.066 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86252.log.gz 2026-03-24T12:09:13.066 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59378.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59378.log.gz 2026-03-24T12:09:13.066 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80788.log 2026-03-24T12:09:13.066 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73692.log 2026-03-24T12:09:13.066 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.58133.log: /var/log/ceph/ceph-client.admin.80788.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56371.log 2026-03-24T12:09:13.067 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.58133.log.gz 2026-03-24T12:09:13.067 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80788.log.gz 2026-03-24T12:09:13.067 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.73692.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73692.log.gz 2026-03-24T12:09:13.067 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32068.log 2026-03-24T12:09:13.067 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57835.log 2026-03-24T12:09:13.067 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.56371.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56371.log.gz 2026-03-24T12:09:13.067 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44476.log 2026-03-24T12:09:13.068 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32068.log: /var/log/ceph/ceph-client.admin.57835.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.57835.log.gz 2026-03-24T12:09:13.068 INFO:teuthology.orchestra.run.vm05.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.49372.log 2026-03-24T12:09:13.068 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32068.log.gz 2026-03-24T12:09:13.068 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44476.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44476.log.gz 2026-03-24T12:09:13.068 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46244.log 2026-03-24T12:09:13.069 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84130.log 2026-03-24T12:09:13.069 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.49372.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.49372.log.gz 2026-03-24T12:09:13.069 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58735.log 2026-03-24T12:09:13.069 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46244.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46244.log.gz 2026-03-24T12:09:13.069 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84130.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66542.log 2026-03-24T12:09:13.069 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84130.log.gz 2026-03-24T12:09:13.070 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63457.log 2026-03-24T12:09:13.070 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.58735.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58735.log.gz 2026-03-24T12:09:13.070 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31321.log 2026-03-24T12:09:13.070 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66542.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66542.log.gz 2026-03-24T12:09:13.070 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63457.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74864.log 2026-03-24T12:09:13.070 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63457.log.gz 2026-03-24T12:09:13.071 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.31321.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31321.log.gz 2026-03-24T12:09:13.071 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84173.log 2026-03-24T12:09:13.071 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.57607.log 2026-03-24T12:09:13.072 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74864.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74864.log.gz 2026-03-24T12:09:13.072 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84173.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84173.log.gz 2026-03-24T12:09:13.072 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77053.log 2026-03-24T12:09:13.072 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82800.log 2026-03-24T12:09:13.072 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.57607.log: /var/log/ceph/ceph-client.admin.77053.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40005.log 2026-03-24T12:09:13.072 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77053.log.gz 2026-03-24T12:09:13.072 INFO:teuthology.orchestra.run.vm05.stderr: 43.2% -- replaced with /var/log/ceph/ceph-client.admin.57607.log.gz 2026-03-24T12:09:13.073 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82800.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82800.log.gz 2026-03-24T12:09:13.073 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26705.log 2026-03-24T12:09:13.073 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63514.log 2026-03-24T12:09:13.074 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40005.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40005.log.gz 2026-03-24T12:09:13.074 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62131.log 2026-03-24T12:09:13.074 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.26705.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.26705.log.gz 2026-03-24T12:09:13.074 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63514.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63514.log.gz 2026-03-24T12:09:13.074 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65385.log 2026-03-24T12:09:13.074 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36385.log 2026-03-24T12:09:13.074 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62131.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62131.log.gz 2026-03-24T12:09:13.074 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54942.log 2026-03-24T12:09:13.075 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65385.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65385.log.gz/var/log/ceph/ceph-client.admin.36385.log: 2026-03-24T12:09:13.075 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36385.log.gz 2026-03-24T12:09:13.075 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87029.log 2026-03-24T12:09:13.075 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67905.log 2026-03-24T12:09:13.075 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54942.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54942.log.gz 2026-03-24T12:09:13.075 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.87029.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87029.log.gz 2026-03-24T12:09:13.075 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78504.log 2026-03-24T12:09:13.076 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67905.log: 
gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77311.log 2026-03-24T12:09:13.076 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67905.log.gz 2026-03-24T12:09:13.076 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78504.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83191.log 2026-03-24T12:09:13.076 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78504.log.gz 2026-03-24T12:09:13.076 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48358.log 2026-03-24T12:09:13.076 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77311.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77311.log.gz 2026-03-24T12:09:13.076 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82542.log 2026-03-24T12:09:13.077 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83191.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83191.log.gz/var/log/ceph/ceph-client.admin.48358.log: 2026-03-24T12:09:13.077 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48358.log.gz 2026-03-24T12:09:13.077 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32563.log 2026-03-24T12:09:13.077 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82542.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82542.log.gz 2026-03-24T12:09:13.077 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46781.log 2026-03-24T12:09:13.078 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71094.log 2026-03-24T12:09:13.078 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32563.log: 
/var/log/ceph/ceph-client.admin.46781.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42689.log 2026-03-24T12:09:13.078 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46781.log.gz 2026-03-24T12:09:13.078 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32563.log.gz 2026-03-24T12:09:13.078 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.71094.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71094.log.gz 2026-03-24T12:09:13.079 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49437.log 2026-03-24T12:09:13.079 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37261.log 2026-03-24T12:09:13.079 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.42689.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42689.log.gz 2026-03-24T12:09:13.079 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72847.log 2026-03-24T12:09:13.079 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.49437.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49437.log.gz 2026-03-24T12:09:13.079 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.37261.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83285.log 2026-03-24T12:09:13.080 INFO:teuthology.orchestra.run.vm05.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.37261.log.gz 2026-03-24T12:09:13.080 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80659.log 2026-03-24T12:09:13.080 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72847.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72847.log.gz 2026-03-24T12:09:13.080 
INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83285.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31128.log 2026-03-24T12:09:13.080 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83285.log.gz 2026-03-24T12:09:13.080 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80659.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76406.log 2026-03-24T12:09:13.080 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80659.log.gz 2026-03-24T12:09:13.080 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57389.log 2026-03-24T12:09:13.081 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.31128.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31128.log.gz 2026-03-24T12:09:13.081 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.76406.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46128.log 2026-03-24T12:09:13.081 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76406.log.gz 2026-03-24T12:09:13.081 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.57389.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35272.log 2026-03-24T12:09:13.081 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57389.log.gz 2026-03-24T12:09:13.081 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46128.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46128.log.gz 2026-03-24T12:09:13.082 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47923.log 2026-03-24T12:09:13.082 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90123.log 2026-03-24T12:09:13.082 
INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35272.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35272.log.gz 2026-03-24T12:09:13.082 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.47923.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48814.log 2026-03-24T12:09:13.082 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47923.log.gz 2026-03-24T12:09:13.083 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90123.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64559.log 2026-03-24T12:09:13.083 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90123.log.gz 2026-03-24T12:09:13.083 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55223.log 2026-03-24T12:09:13.083 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.48814.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48814.log.gz 2026-03-24T12:09:13.083 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47211.log 2026-03-24T12:09:13.083 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64559.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64559.log.gz 2026-03-24T12:09:13.083 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55223.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42226.log 2026-03-24T12:09:13.083 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55223.log.gz 2026-03-24T12:09:13.084 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.47211.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47211.log.gz 2026-03-24T12:09:13.084 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87569.log 
2026-03-24T12:09:13.084 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69749.log 2026-03-24T12:09:13.085 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.42226.log: /var/log/ceph/ceph-client.admin.87569.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87569.log.gzgzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80917.log 2026-03-24T12:09:13.085 INFO:teuthology.orchestra.run.vm05.stderr: 2026-03-24T12:09:13.085 INFO:teuthology.orchestra.run.vm05.stderr: 58.2% -- replaced with /var/log/ceph/ceph-client.admin.42226.log.gz 2026-03-24T12:09:13.085 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.69749.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69749.log.gz 2026-03-24T12:09:13.085 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53159.log 2026-03-24T12:09:13.085 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33512.log 2026-03-24T12:09:13.086 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80917.log: /var/log/ceph/ceph-client.admin.53159.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80917.log.gz 2026-03-24T12:09:13.086 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36113.log 2026-03-24T12:09:13.086 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53159.log.gz 2026-03-24T12:09:13.086 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.33512.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33512.log.gz 2026-03-24T12:09:13.086 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34572.log 2026-03-24T12:09:13.086 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91194.log 2026-03-24T12:09:13.087 
INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36113.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36113.log.gz 2026-03-24T12:09:13.087 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34572.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.34572.log.gz -5 --verbose -- /var/log/ceph/ceph-client.admin.44175.log 2026-03-24T12:09:13.087 INFO:teuthology.orchestra.run.vm05.stderr: 2026-03-24T12:09:13.087 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.91194.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35736.log 2026-03-24T12:09:13.087 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91194.log.gz 2026-03-24T12:09:13.087 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73370.log 2026-03-24T12:09:13.087 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44175.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44175.log.gz 2026-03-24T12:09:13.088 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60743.log 2026-03-24T12:09:13.088 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35736.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35736.log.gz 2026-03-24T12:09:13.088 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.73370.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91878.log 2026-03-24T12:09:13.088 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73370.log.gz 2026-03-24T12:09:13.088 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67411.log 2026-03-24T12:09:13.088 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.60743.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60743.log.gz 
2026-03-24T12:09:13.088 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77590.log 2026-03-24T12:09:13.088 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.91878.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91878.log.gz 2026-03-24T12:09:13.089 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67411.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67411.log.gz 2026-03-24T12:09:13.089 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32119.log 2026-03-24T12:09:13.089 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30240.log 2026-03-24T12:09:13.089 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77590.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77590.log.gz 2026-03-24T12:09:13.089 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48484.log 2026-03-24T12:09:13.089 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32119.log: /var/log/ceph/ceph-client.admin.30240.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30240.log.gz 2026-03-24T12:09:13.089 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67153.log 2026-03-24T12:09:13.090 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59969.log 2026-03-24T12:09:13.090 INFO:teuthology.orchestra.run.vm05.stderr: 1.2%/var/log/ceph/ceph-client.admin.48484.log: -- replaced with /var/log/ceph/ceph-client.admin.32119.log.gz 2026-03-24T12:09:13.090 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67153.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67153.log.gz 2026-03-24T12:09:13.090 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.73607.log 2026-03-24T12:09:13.090 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.48484.log.gz 2026-03-24T12:09:13.090 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38276.log 2026-03-24T12:09:13.091 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31493.log 2026-03-24T12:09:13.091 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59969.log: /var/log/ceph/ceph-client.admin.73607.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59969.log.gz 2026-03-24T12:09:13.091 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73607.log.gz 2026-03-24T12:09:13.091 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.38276.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35152.log 2026-03-24T12:09:13.091 INFO:teuthology.orchestra.run.vm05.stderr: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.38276.log.gz 2026-03-24T12:09:13.092 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.31493.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31493.log.gz 2026-03-24T12:09:13.092 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43614.log 2026-03-24T12:09:13.092 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose/var/log/ceph/ceph-client.admin.35152.log: -- /var/log/ceph/ceph-client.admin.25716.log 2026-03-24T12:09:13.092 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35152.log.gz 2026-03-24T12:09:13.092 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43614.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60248.log 2026-03-24T12:09:13.092 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.25716.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.61366.log 2026-03-24T12:09:13.093 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.43614.log.gz 2026-03-24T12:09:13.093 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25716.log.gz 2026-03-24T12:09:13.093 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.60248.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60248.log.gz 2026-03-24T12:09:13.093 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73478.log 2026-03-24T12:09:13.093 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59046.log 2026-03-24T12:09:13.093 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61366.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61366.log.gz 2026-03-24T12:09:13.093 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.73478.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35974.log 2026-03-24T12:09:13.093 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73478.log.gz 2026-03-24T12:09:13.094 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59046.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49889.log 2026-03-24T12:09:13.094 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59046.log.gz 2026-03-24T12:09:13.094 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55416.log 2026-03-24T12:09:13.094 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35974.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35974.log.gz 2026-03-24T12:09:13.094 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.49889.log: gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.65937.log 2026-03-24T12:09:13.094 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49889.log.gz 2026-03-24T12:09:13.094 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55416.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36756.log 2026-03-24T12:09:13.094 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55416.log.gz 2026-03-24T12:09:13.095 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65937.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65937.log.gz 2026-03-24T12:09:13.095 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67346.log 2026-03-24T12:09:13.095 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75098.log 2026-03-24T12:09:13.095 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36756.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36756.log.gz 2026-03-24T12:09:13.096 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67346.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65071.log 2026-03-24T12:09:13.096 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67346.log.gz 2026-03-24T12:09:13.096 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75098.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32290.log 2026-03-24T12:09:13.096 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75098.log.gz 2026-03-24T12:09:13.096 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80084.log 2026-03-24T12:09:13.096 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65071.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.65071.log.gz 2026-03-24T12:09:13.096 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32290.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63798.log 2026-03-24T12:09:13.097 INFO:teuthology.orchestra.run.vm05.stderr: 2.5% -- replaced with /var/log/ceph/ceph-client.admin.32290.log.gz 2026-03-24T12:09:13.097 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80084.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80084.log.gz 2026-03-24T12:09:13.097 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49135.log 2026-03-24T12:09:13.097 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63798.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29402.log 2026-03-24T12:09:13.097 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63798.log.gz 2026-03-24T12:09:13.097 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.49135.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90725.log 2026-03-24T12:09:13.097 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49135.log.gz 2026-03-24T12:09:13.097 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54770.log 2026-03-24T12:09:13.098 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.29402.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29402.log.gz 2026-03-24T12:09:13.098 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90725.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37429.log 2026-03-24T12:09:13.098 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90725.log.gz 2026-03-24T12:09:13.098 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54770.log: gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.29188.log 2026-03-24T12:09:13.098 INFO:teuthology.orchestra.run.vm05.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.54770.log.gz 2026-03-24T12:09:13.099 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42754.log 2026-03-24T12:09:13.099 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.37429.log: /var/log/ceph/ceph-client.admin.29188.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37429.log.gz 2026-03-24T12:09:13.099 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28973.log 2026-03-24T12:09:13.099 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29188.log.gz 2026-03-24T12:09:13.099 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.42754.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42754.log.gz 2026-03-24T12:09:13.099 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60549.log 2026-03-24T12:09:13.099 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68120.log 2026-03-24T12:09:13.100 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28973.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28973.log.gz 2026-03-24T12:09:13.100 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72787.log 2026-03-24T12:09:13.100 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.60549.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60549.log.gz 2026-03-24T12:09:13.100 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68120.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68120.log.gz 2026-03-24T12:09:13.100 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.36215.log 2026-03-24T12:09:13.100 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64167.log 2026-03-24T12:09:13.100 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72787.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72787.log.gz 2026-03-24T12:09:13.100 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37961.log 2026-03-24T12:09:13.101 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36215.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36215.log.gz 2026-03-24T12:09:13.101 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64167.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64167.log.gz 2026-03-24T12:09:13.101 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54357.log 2026-03-24T12:09:13.101 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29681.log 2026-03-24T12:09:13.101 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.37961.log: /var/log/ceph/ceph-client.admin.54357.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91852.log 2026-03-24T12:09:13.101 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54357.log.gz 2026-03-24T12:09:13.101 INFO:teuthology.orchestra.run.vm05.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.37961.log.gz 2026-03-24T12:09:13.101 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.29681.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29681.log.gz 2026-03-24T12:09:13.102 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46227.log 2026-03-24T12:09:13.102 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.74759.log 2026-03-24T12:09:13.102 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.91852.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91852.log.gz 2026-03-24T12:09:13.102 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46227.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30090.log 2026-03-24T12:09:13.102 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46227.log.gz 2026-03-24T12:09:13.102 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74759.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74759.log.gz 2026-03-24T12:09:13.102 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53576.log 2026-03-24T12:09:13.103 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58026.log 2026-03-24T12:09:13.103 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30090.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30090.log.gz 2026-03-24T12:09:13.103 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63420.log 2026-03-24T12:09:13.103 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.53576.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53576.log.gz 2026-03-24T12:09:13.103 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.58026.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58026.log.gz 2026-03-24T12:09:13.103 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75954.log 2026-03-24T12:09:13.103 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75018.log 2026-03-24T12:09:13.104 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63420.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.63420.log.gz 2026-03-24T12:09:13.104 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91265.log 2026-03-24T12:09:13.104 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75954.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75954.log.gz 2026-03-24T12:09:13.104 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75018.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75018.log.gz 2026-03-24T12:09:13.104 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35232.log 2026-03-24T12:09:13.104 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34392.log 2026-03-24T12:09:13.104 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.91265.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91265.log.gz 2026-03-24T12:09:13.104 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44454.log 2026-03-24T12:09:13.105 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35232.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35232.log.gz 2026-03-24T12:09:13.105 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34392.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34392.log.gz 2026-03-24T12:09:13.105 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44092.log 2026-03-24T12:09:13.105 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41040.log 2026-03-24T12:09:13.105 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44454.log: /var/log/ceph/ceph-client.admin.44092.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49393.log 2026-03-24T12:09:13.105 
2026-03-24T12:09:13.106 INFO:teuthology.orchestra.run.vm05.stderr: [output of parallel gzip workers, interleaved in the raw log; summarized here]
2026-03-24T12:09:13.106..12:09:13.148 INFO:teuthology.orchestra.run.vm05.stderr: gzip -5 --verbose compressed the per-client logs under /var/log/ceph/ (ceph-client.admin.<pid>.log for roughly 150 pids, plus /var/log/ceph/ceph-osd.1.log); each file was replaced with the corresponding .log.gz. Reported compression ratios were mostly 0.0%, with a handful higher (e.g. 1.2%, 15.7%, 25.3%–29.3%, 57.7%, 57.9%, 59.0%, 91.1%).
replaced with /var/log/ceph/ceph-client.admin.79413.log.gz 2026-03-24T12:09:13.148 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69298.log.gz 2026-03-24T12:09:13.148 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43106.log 2026-03-24T12:09:13.148 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84194.log 2026-03-24T12:09:13.148 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55266.log: /var/log/ceph/ceph-client.admin.43106.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55266.log.gz 2026-03-24T12:09:13.148 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43106.log.gz 2026-03-24T12:09:13.148 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38381.log 2026-03-24T12:09:13.149 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57098.log 2026-03-24T12:09:13.149 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84194.log: /var/log/ceph/ceph-client.admin.38381.log: 54.6% -- replaced with /var/log/ceph/ceph-client.admin.84194.log.gz 2026-03-24T12:09:13.149 INFO:teuthology.orchestra.run.vm05.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.38381.log.gz 2026-03-24T12:09:13.149 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69642.log 2026-03-24T12:09:13.149 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80401.log 2026-03-24T12:09:13.150 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.57098.log: /var/log/ceph/ceph-client.admin.69642.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57098.log.gz 2026-03-24T12:09:13.150 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.69642.log.gz 2026-03-24T12:09:13.150 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59908.log 2026-03-24T12:09:13.150 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57369.log 2026-03-24T12:09:13.150 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80401.log: /var/log/ceph/ceph-client.admin.59908.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80401.log.gz 2026-03-24T12:09:13.150 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59908.log.gz 2026-03-24T12:09:13.150 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38129.log 2026-03-24T12:09:13.151 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85304.log 2026-03-24T12:09:13.151 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.57369.log: /var/log/ceph/ceph-client.admin.38129.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.38129.log.gz 2026-03-24T12:09:13.151 INFO:teuthology.orchestra.run.vm05.stderr: 29.6% -- replaced with /var/log/ceph/ceph-client.admin.57369.log.gz 2026-03-24T12:09:13.151 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60119.log 2026-03-24T12:09:13.151 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42668.log 2026-03-24T12:09:13.152 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.85304.log: /var/log/ceph/ceph-client.admin.60119.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60119.log.gz 2026-03-24T12:09:13.152 INFO:teuthology.orchestra.run.vm05.stderr: 90.2% -- replaced with /var/log/ceph/ceph-client.admin.85304.log.gz 2026-03-24T12:09:13.152 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.25949.log 2026-03-24T12:09:13.152 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45371.log 2026-03-24T12:09:13.152 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.42668.log: /var/log/ceph/ceph-client.admin.25949.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42668.log.gz 2026-03-24T12:09:13.152 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25949.log.gz 2026-03-24T12:09:13.153 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53740.log 2026-03-24T12:09:13.153 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57566.log 2026-03-24T12:09:13.153 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45371.log: /var/log/ceph/ceph-client.admin.53740.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45371.log.gz 2026-03-24T12:09:13.153 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53740.log.gz 2026-03-24T12:09:13.153 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58499.log 2026-03-24T12:09:13.153 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41926.log 2026-03-24T12:09:13.154 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.57566.log: /var/log/ceph/ceph-client.admin.58499.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57566.log.gz 2026-03-24T12:09:13.154 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58499.log.gz 2026-03-24T12:09:13.154 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50125.log 2026-03-24T12:09:13.154 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.56099.log 2026-03-24T12:09:13.155 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41926.log: /var/log/ceph/ceph-client.admin.50125.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50125.log.gz 2026-03-24T12:09:13.155 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41926.log.gz 2026-03-24T12:09:13.155 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91930.log 2026-03-24T12:09:13.155 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45912.log 2026-03-24T12:09:13.156 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.56099.log: /var/log/ceph/ceph-client.admin.91930.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56099.log.gz 2026-03-24T12:09:13.156 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91930.log.gz 2026-03-24T12:09:13.156 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77705.log 2026-03-24T12:09:13.156 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67647.log 2026-03-24T12:09:13.157 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77705.log: /var/log/ceph/ceph-client.admin.45912.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77705.log.gz 2026-03-24T12:09:13.157 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45912.log.gz 2026-03-24T12:09:13.157 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29961.log 2026-03-24T12:09:13.157 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77471.log 2026-03-24T12:09:13.157 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67647.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.67647.log.gz 2026-03-24T12:09:13.158 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.29961.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29961.log.gz 2026-03-24T12:09:13.158 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36569.log 2026-03-24T12:09:13.158 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71953.log 2026-03-24T12:09:13.158 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77471.log: /var/log/ceph/ceph-client.admin.36569.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77471.log.gz 2026-03-24T12:09:13.158 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36569.log.gz 2026-03-24T12:09:13.159 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82367.log 2026-03-24T12:09:13.159 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27900.log 2026-03-24T12:09:13.159 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.71953.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71953.log.gz 2026-03-24T12:09:13.159 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82367.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82367.log.gz 2026-03-24T12:09:13.159 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77204.log 2026-03-24T12:09:13.160 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63187.log 2026-03-24T12:09:13.160 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27900.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27900.log.gz 2026-03-24T12:09:13.160 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77204.log: 
0.0% -- replaced with /var/log/ceph/ceph-client.admin.77204.log.gz 2026-03-24T12:09:13.160 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63401.log 2026-03-24T12:09:13.161 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62808.log 2026-03-24T12:09:13.161 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63187.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63187.log.gz 2026-03-24T12:09:13.161 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63401.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63401.log.gz 2026-03-24T12:09:13.161 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28373.log 2026-03-24T12:09:13.161 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63005.log 2026-03-24T12:09:13.162 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62808.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62808.log.gz 2026-03-24T12:09:13.162 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28373.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28373.log.gz 2026-03-24T12:09:13.162 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57269.log 2026-03-24T12:09:13.162 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61216.log 2026-03-24T12:09:13.162 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63005.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63005.log.gz 2026-03-24T12:09:13.162 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.57269.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57269.log.gz 2026-03-24T12:09:13.163 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.91722.log 2026-03-24T12:09:13.163 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76967.log 2026-03-24T12:09:13.163 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61216.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61216.log.gz 2026-03-24T12:09:13.163 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.91722.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91722.log.gz 2026-03-24T12:09:13.163 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46174.log 2026-03-24T12:09:13.164 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47426.log 2026-03-24T12:09:13.164 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.76967.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76967.log.gz 2026-03-24T12:09:13.164 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46174.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46174.log.gz 2026-03-24T12:09:13.164 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67884.log 2026-03-24T12:09:13.164 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71567.log 2026-03-24T12:09:13.165 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.47426.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47426.log.gz 2026-03-24T12:09:13.165 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67884.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67884.log.gz 2026-03-24T12:09:13.165 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39085.log 2026-03-24T12:09:13.165 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.71696.log 2026-03-24T12:09:13.165 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.71567.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71567.log.gz 2026-03-24T12:09:13.166 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39085.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.39085.log.gz 2026-03-24T12:09:13.166 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39148.log 2026-03-24T12:09:13.166 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59665.log 2026-03-24T12:09:13.166 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.71696.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71696.log.gz 2026-03-24T12:09:13.167 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39148.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50292.log 2026-03-24T12:09:13.167 INFO:teuthology.orchestra.run.vm05.stderr: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.39148.log.gz 2026-03-24T12:09:13.167 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59665.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59665.log.gz 2026-03-24T12:09:13.167 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52496.log 2026-03-24T12:09:13.168 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29853.log 2026-03-24T12:09:13.168 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50292.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50292.log.gz 2026-03-24T12:09:13.168 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.52496.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31983.log 2026-03-24T12:09:13.168 INFO:teuthology.orchestra.run.vm05.stderr: 58.8% -- 
replaced with /var/log/ceph/ceph-client.admin.52496.log.gz 2026-03-24T12:09:13.168 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.29853.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29853.log.gz 2026-03-24T12:09:13.169 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35212.log 2026-03-24T12:09:13.169 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31864.log 2026-03-24T12:09:13.169 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.31983.log: /var/log/ceph/ceph-client.admin.35212.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35212.log.gz 2026-03-24T12:09:13.169 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81651.log 2026-03-24T12:09:13.169 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.31983.log.gz 2026-03-24T12:09:13.169 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90639.log 2026-03-24T12:09:13.170 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.81651.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81651.log.gz 2026-03-24T12:09:13.170 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.31864.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.31864.log.gz 2026-03-24T12:09:13.170 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54090.log 2026-03-24T12:09:13.171 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50542.log 2026-03-24T12:09:13.171 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90639.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90639.log.gz 2026-03-24T12:09:13.171 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54090.log: 
0.0% -- replaced with /var/log/ceph/ceph-client.admin.54090.log.gz 2026-03-24T12:09:13.171 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43023.log 2026-03-24T12:09:13.171 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57506.log 2026-03-24T12:09:13.171 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50542.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50542.log.gz 2026-03-24T12:09:13.172 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43023.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64344.log 2026-03-24T12:09:13.172 INFO:teuthology.orchestra.run.vm05.stderr: 53.7% -- replaced with /var/log/ceph/ceph-client.admin.43023.log.gz 2026-03-24T12:09:13.172 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.57506.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57506.log.gz 2026-03-24T12:09:13.172 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32870.log 2026-03-24T12:09:13.173 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88415.log 2026-03-24T12:09:13.173 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64344.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64344.log.gz 2026-03-24T12:09:13.173 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32870.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32870.log.gz 2026-03-24T12:09:13.173 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90424.log 2026-03-24T12:09:13.173 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65267.log 2026-03-24T12:09:13.174 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.88415.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.88415.log.gz 2026-03-24T12:09:13.174 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90424.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90424.log.gz 2026-03-24T12:09:13.174 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49264.log 2026-03-24T12:09:13.174 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30671.log 2026-03-24T12:09:13.175 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65267.log: /var/log/ceph/ceph-client.admin.49264.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49264.log.gz 2026-03-24T12:09:13.175 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65267.log.gz 2026-03-24T12:09:13.175 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82693.log 2026-03-24T12:09:13.175 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82456.log 2026-03-24T12:09:13.175 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30671.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30671.log.gz 2026-03-24T12:09:13.176 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82693.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82693.log.gz 2026-03-24T12:09:13.176 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33572.log 2026-03-24T12:09:13.176 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56990.log 2026-03-24T12:09:13.176 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82456.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82456.log.gz 2026-03-24T12:09:13.176 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.33572.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.33572.log.gz 2026-03-24T12:09:13.177 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56969.log 2026-03-24T12:09:13.177 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91085.log 2026-03-24T12:09:13.177 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.56990.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56990.log.gz 2026-03-24T12:09:13.177 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.56969.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56969.log.gz 2026-03-24T12:09:13.177 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59889.log 2026-03-24T12:09:13.178 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36875.log 2026-03-24T12:09:13.178 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.91085.log: 59.5% -- replaced with /var/log/ceph/ceph-client.admin.91085.log.gz 2026-03-24T12:09:13.178 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59889.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59889.log.gz 2026-03-24T12:09:13.178 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61628.log 2026-03-24T12:09:13.178 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42462.log 2026-03-24T12:09:13.179 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36875.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36875.log.gz 2026-03-24T12:09:13.179 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61628.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61628.log.gz 2026-03-24T12:09:13.179 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.86792.log 2026-03-24T12:09:13.179 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48749.log 2026-03-24T12:09:13.179 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.42462.log: /var/log/ceph/ceph-client.admin.86792.log: 56.1% -- replaced with /var/log/ceph/ceph-client.admin.42462.log.gz 2026-03-24T12:09:13.180 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86792.log.gz 2026-03-24T12:09:13.180 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90101.log 2026-03-24T12:09:13.180 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57855.log 2026-03-24T12:09:13.180 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.48749.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48749.log.gz 2026-03-24T12:09:13.180 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90101.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90101.log.gz 2026-03-24T12:09:13.180 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76126.log 2026-03-24T12:09:13.181 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68558.log 2026-03-24T12:09:13.181 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.57855.log: /var/log/ceph/ceph-client.admin.76126.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76126.log.gz 2026-03-24T12:09:13.181 INFO:teuthology.orchestra.run.vm05.stderr: 25.4% -- replaced with /var/log/ceph/ceph-client.admin.57855.log.gz 2026-03-24T12:09:13.181 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68894.log 2026-03-24T12:09:13.182 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.77748.log
2026-03-24T12:09:13.182 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68558.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68558.log.gz
2026-03-24T12:09:13.182 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68894.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68894.log.gz
2026-03-24T12:09:13.182 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50993.log
2026-03-24T12:09:13.183 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77748.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61535.log
2026-03-24T12:09:13.183 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77748.log.gz
2026-03-24T12:09:13.183 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50993.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50993.log.gz
2026-03-24T12:09:13.183 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72548.log
2026-03-24T12:09:13.183 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37471.log
2026-03-24T12:09:13.184 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61535.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61535.log.gz
2026-03-24T12:09:13.184 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72548.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72548.log.gz
2026-03-24T12:09:13.184 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48985.log
2026-03-24T12:09:13.184 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27729.log
2026-03-24T12:09:13.184 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.37471.log: /var/log/ceph/ceph-client.admin.48985.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48985.log.gz
2026-03-24T12:09:13.184 INFO:teuthology.orchestra.run.vm05.stderr: 26.2% -- replaced with /var/log/ceph/ceph-client.admin.37471.log.gz
2026-03-24T12:09:13.185 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74847.log
2026-03-24T12:09:13.185 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53884.log
2026-03-24T12:09:13.185 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27729.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27729.log.gz
2026-03-24T12:09:13.185 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74847.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74847.log.gz
2026-03-24T12:09:13.186 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77435.log
2026-03-24T12:09:13.186 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83764.log
2026-03-24T12:09:13.186 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.53884.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53884.log.gz
2026-03-24T12:09:13.186 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77435.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77435.log.gz
2026-03-24T12:09:13.186 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60012.log
2026-03-24T12:09:13.187 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59745.log
2026-03-24T12:09:13.187 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83764.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83764.log.gz
2026-03-24T12:09:13.187 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.60012.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60012.log.gz
2026-03-24T12:09:13.187 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27054.log
2026-03-24T12:09:13.187 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33315.log
2026-03-24T12:09:13.188 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59745.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59745.log.gz
2026-03-24T12:09:13.188 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27054.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27054.log.gz
2026-03-24T12:09:13.188 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90983.log
2026-03-24T12:09:13.188 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69470.log
2026-03-24T12:09:13.188 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.33315.log: /var/log/ceph/ceph-client.admin.90983.log: 2.5% -- replaced with /var/log/ceph/ceph-client.admin.33315.log.gz
2026-03-24T12:09:13.188 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90983.log.gz
2026-03-24T12:09:13.189 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83515.log
2026-03-24T12:09:13.189 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90402.log
2026-03-24T12:09:13.189 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.69470.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69470.log.gz
2026-03-24T12:09:13.189 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83515.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83515.log.gz
2026-03-24T12:09:13.189 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82650.log
2026-03-24T12:09:13.190 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82392.log
2026-03-24T12:09:13.190 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90402.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90402.log.gz
2026-03-24T12:09:13.190 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82650.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82650.log.gz
2026-03-24T12:09:13.191 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72507.log
2026-03-24T12:09:13.191 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30613.log
2026-03-24T12:09:13.191 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82392.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82392.log.gz
2026-03-24T12:09:13.191 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72507.log: 44.3% -- replaced with /var/log/ceph/ceph-client.admin.72507.log.gz
2026-03-24T12:09:13.191 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53376.log
2026-03-24T12:09:13.192 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66396.log
2026-03-24T12:09:13.192 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30613.log: /var/log/ceph/ceph-client.admin.53376.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30613.log.gz
2026-03-24T12:09:13.192 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53376.log.gz
2026-03-24T12:09:13.192 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85887.log
2026-03-24T12:09:13.192 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35352.log
2026-03-24T12:09:13.193 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66396.log: /var/log/ceph/ceph-client.admin.85887.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66396.log.gz
2026-03-24T12:09:13.193 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85887.log.gz
2026-03-24T12:09:13.193 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66704.log
2026-03-24T12:09:13.193 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35352.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86986.log
2026-03-24T12:09:13.193 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35352.log.gz
2026-03-24T12:09:13.193 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66704.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72864.log
2026-03-24T12:09:13.194 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66704.log.gz
2026-03-24T12:09:13.194 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.86986.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86986.log.gz
2026-03-24T12:09:13.194 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62985.log
2026-03-24T12:09:13.194 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52557.log
2026-03-24T12:09:13.195 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72864.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72864.log.gz
2026-03-24T12:09:13.195 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62985.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62985.log.gz
2026-03-24T12:09:13.195 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74511.log
2026-03-24T12:09:13.195 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34772.log
2026-03-24T12:09:13.195 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.52557.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52557.log.gz
2026-03-24T12:09:13.195 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74511.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74511.log.gz
2026-03-24T12:09:13.196 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50248.log
2026-03-24T12:09:13.196 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43929.log
2026-03-24T12:09:13.196 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34772.log: /var/log/ceph/ceph-client.admin.50248.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50248.log.gz
2026-03-24T12:09:13.196 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34772.log.gz
2026-03-24T12:09:13.196 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83336.log
2026-03-24T12:09:13.197 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28887.log
2026-03-24T12:09:13.197 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43929.log: /var/log/ceph/ceph-client.admin.83336.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83336.log.gz
2026-03-24T12:09:13.197 INFO:teuthology.orchestra.run.vm05.stderr: 25.8% -- replaced with /var/log/ceph/ceph-client.admin.43929.log.gz
2026-03-24T12:09:13.197 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66334.log
2026-03-24T12:09:13.198 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79822.log
2026-03-24T12:09:13.198 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28887.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28887.log.gz
2026-03-24T12:09:13.198 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66334.log: 55.5% -- replaced with /var/log/ceph/ceph-client.admin.66334.log.gz
2026-03-24T12:09:13.198 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65915.log
2026-03-24T12:09:13.199 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38549.log
2026-03-24T12:09:13.199 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.79822.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79822.log.gz
2026-03-24T12:09:13.199 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65915.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65915.log.gz
2026-03-24T12:09:13.199 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74303.log
2026-03-24T12:09:13.199 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65765.log
2026-03-24T12:09:13.199 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.38549.log: /var/log/ceph/ceph-client.admin.74303.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38549.log.gz
2026-03-24T12:09:13.200 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74303.log.gz
2026-03-24T12:09:13.200 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30772.log
2026-03-24T12:09:13.200 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46404.log
2026-03-24T12:09:13.200 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65765.log: /var/log/ceph/ceph-client.admin.30772.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65765.log.gz
2026-03-24T12:09:13.200 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30772.log.gz
2026-03-24T12:09:13.200 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58671.log
2026-03-24T12:09:13.201 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65190.log
2026-03-24T12:09:13.201 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46404.log: /var/log/ceph/ceph-client.admin.58671.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46404.log.gz
2026-03-24T12:09:13.201 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58671.log.gz
2026-03-24T12:09:13.201 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45783.log
2026-03-24T12:09:13.201 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46584.log
2026-03-24T12:09:13.202 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65190.log: /var/log/ceph/ceph-client.admin.45783.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45783.log.gz
2026-03-24T12:09:13.202 INFO:teuthology.orchestra.run.vm05.stderr: 58.2% -- replaced with /var/log/ceph/ceph-client.admin.65190.log.gz
2026-03-24T12:09:13.202 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90617.log
2026-03-24T12:09:13.202 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53539.log
2026-03-24T12:09:13.202 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46584.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46584.log.gz
2026-03-24T12:09:13.203 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90617.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90617.log.gz
2026-03-24T12:09:13.203 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48921.log
2026-03-24T12:09:13.203 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91748.log
2026-03-24T12:09:13.203 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.53539.log: /var/log/ceph/ceph-client.admin.48921.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48921.log.gz
2026-03-24T12:09:13.203 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53539.log.gz
2026-03-24T12:09:13.204 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81780.log
2026-03-24T12:09:13.204 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58649.log
2026-03-24T12:09:13.204 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.91748.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91748.log.gz
2026-03-24T12:09:13.204 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.81780.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81780.log.gz
2026-03-24T12:09:13.204 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59301.log
2026-03-24T12:09:13.205 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39169.log
2026-03-24T12:09:13.205 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.58649.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58649.log.gz
2026-03-24T12:09:13.205 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59301.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59301.log.gz
2026-03-24T12:09:13.205 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30848.log
2026-03-24T12:09:13.205 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75358.log
2026-03-24T12:09:13.206 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39169.log: /var/log/ceph/ceph-client.admin.30848.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30848.log.gz
2026-03-24T12:09:13.206 INFO:teuthology.orchestra.run.vm05.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.39169.log.gz
2026-03-24T12:09:13.206 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47659.log
2026-03-24T12:09:13.206 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34672.log
2026-03-24T12:09:13.207 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75358.log: /var/log/ceph/ceph-client.admin.47659.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75358.log.gz
2026-03-24T12:09:13.207 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47659.log.gz
2026-03-24T12:09:13.207 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28072.log
2026-03-24T12:09:13.207 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33023.log
2026-03-24T12:09:13.207 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34672.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34672.log.gz
2026-03-24T12:09:13.208 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28072.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28072.log.gz
2026-03-24T12:09:13.208 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62010.log
2026-03-24T12:09:13.208 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38444.log
2026-03-24T12:09:13.208 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.33023.log: /var/log/ceph/ceph-client.admin.62010.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62010.log.gz
2026-03-24T12:09:13.208 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.33023.log.gz
2026-03-24T12:09:13.208 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78804.log
2026-03-24T12:09:13.209 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54878.log
2026-03-24T12:09:13.209 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.38444.log: /var/log/ceph/ceph-client.admin.78804.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78804.log.gz
2026-03-24T12:09:13.209 INFO:teuthology.orchestra.run.vm05.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.38444.log.gz
2026-03-24T12:09:13.209 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78371.log
2026-03-24T12:09:13.209 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54813.log
2026-03-24T12:09:13.210 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54878.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54878.log.gz
2026-03-24T12:09:13.210 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78371.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78371.log.gz
2026-03-24T12:09:13.210 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73220.log
2026-03-24T12:09:13.210 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29488.log
2026-03-24T12:09:13.210 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54813.log: /var/log/ceph/ceph-client.admin.73220.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73220.log.gz
2026-03-24T12:09:13.210 INFO:teuthology.orchestra.run.vm05.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.54813.log.gz
2026-03-24T12:09:13.211 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64541.log
2026-03-24T12:09:13.211 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28951.log
2026-03-24T12:09:13.211 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.29488.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29488.log.gz
2026-03-24T12:09:13.211 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64541.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64541.log.gz
2026-03-24T12:09:13.211 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50039.log
2026-03-24T12:09:13.212 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75997.log
2026-03-24T12:09:13.212 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28951.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28951.log.gz
2026-03-24T12:09:13.212 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50039.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50039.log.gz
2026-03-24T12:09:13.212 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61687.log
2026-03-24T12:09:13.212 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36960.log
2026-03-24T12:09:13.212 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75997.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75997.log.gz
2026-03-24T12:09:13.212 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61687.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61687.log.gz
2026-03-24T12:09:13.213 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87200.log
2026-03-24T12:09:13.213 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph.tmp-client.admin.19122.log
2026-03-24T12:09:13.213 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36960.log: /var/log/ceph/ceph-client.admin.87200.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36960.log.gz
2026-03-24T12:09:13.213 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87200.log.gz
2026-03-24T12:09:13.213 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43293.log
2026-03-24T12:09:13.214 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72040.log
2026-03-24T12:09:13.214 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph.tmp-client.admin.19122.log: /var/log/ceph/ceph-client.admin.43293.log: 0.0% -- replaced with /var/log/ceph/ceph.tmp-client.admin.19122.log.gz
2026-03-24T12:09:13.214 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43293.log.gz
2026-03-24T12:09:13.214 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74213.log
2026-03-24T12:09:13.214 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55028.log
2026-03-24T12:09:13.214 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72040.log: /var/log/ceph/ceph-client.admin.74213.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72040.log.gz
2026-03-24T12:09:13.215 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74213.log.gz
2026-03-24T12:09:13.215 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62169.log
2026-03-24T12:09:13.215 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39833.log
2026-03-24T12:09:13.215 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55028.log: /var/log/ceph/ceph-client.admin.62169.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55028.log.gz
2026-03-24T12:09:13.215 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62169.log.gz
2026-03-24T12:09:13.215 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48857.log
2026-03-24T12:09:13.216 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82843.log
2026-03-24T12:09:13.216 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39833.log: /var/log/ceph/ceph-client.admin.48857.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39833.log.gz
2026-03-24T12:09:13.216 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48857.log.gz
2026-03-24T12:09:13.216 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65249.log
2026-03-24T12:09:13.216 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34932.log
2026-03-24T12:09:13.217 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82843.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82843.log.gz
2026-03-24T12:09:13.217 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65249.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65249.log.gz
2026-03-24T12:09:13.217 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35787.log
2026-03-24T12:09:13.217 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27130.log
2026-03-24T12:09:13.217 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34932.log: /var/log/ceph/ceph-client.admin.35787.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34932.log.gz
2026-03-24T12:09:13.218 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35787.log.gz
2026-03-24T12:09:13.218 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35112.log
2026-03-24T12:09:13.218 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83614.log
2026-03-24T12:09:13.218 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27130.log: /var/log/ceph/ceph-client.admin.35112.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27130.log.gz
2026-03-24T12:09:13.218 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35112.log.gz
2026-03-24T12:09:13.218 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47018.log
2026-03-24T12:09:13.219 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68727.log
2026-03-24T12:09:13.219 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83614.log: /var/log/ceph/ceph-client.admin.47018.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47018.log.gz
2026-03-24T12:09:13.219 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83614.log.gz
2026-03-24T12:09:13.219 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54921.log
2026-03-24T12:09:13.220 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28438.log
2026-03-24T12:09:13.220 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68727.log: /var/log/ceph/ceph-client.admin.54921.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68727.log.gz
2026-03-24T12:09:13.220 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54921.log.gz
2026-03-24T12:09:13.220 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88257.log
2026-03-24T12:09:13.220 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82671.log
2026-03-24T12:09:13.221 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28438.log: /var/log/ceph/ceph-client.admin.88257.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28438.log.gz
2026-03-24T12:09:13.221 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88257.log.gz
2026-03-24T12:09:13.221 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72826.log
2026-03-24T12:09:13.221 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44705.log
2026-03-24T12:09:13.221 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82671.log: /var/log/ceph/ceph-client.admin.72826.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82671.log.gz
2026-03-24T12:09:13.221 INFO:teuthology.orchestra.run.vm05.stderr: 26.8% -- replaced with /var/log/ceph/ceph-client.admin.72826.log.gz
2026-03-24T12:09:13.221 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51230.log
2026-03-24T12:09:13.222 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79926.log
2026-03-24T12:09:13.222 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44705.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44705.log.gz
2026-03-24T12:09:13.222 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.51230.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51230.log.gz
2026-03-24T12:09:13.222 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30369.log
2026-03-24T12:09:13.222 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39489.log
2026-03-24T12:09:13.222 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.79926.log: /var/log/ceph/ceph-client.admin.30369.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79926.log.gz
2026-03-24T12:09:13.223 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30369.log.gz
2026-03-24T12:09:13.223 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88128.log
2026-03-24T12:09:13.223 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36790.log
2026-03-24T12:09:13.223 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39489.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39489.log.gz
2026-03-24T12:09:13.223 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.88128.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88128.log.gz
2026-03-24T12:09:13.223 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79977.log
2026-03-24T12:09:13.224 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29918.log
2026-03-24T12:09:13.224 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36790.log: /var/log/ceph/ceph-client.admin.79977.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36790.log.gz
2026-03-24T12:09:13.224 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79977.log.gz
2026-03-24T12:09:13.224 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79875.log
2026-03-24T12:09:13.224 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64910.log
2026-03-24T12:09:13.224 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.29918.log: /var/log/ceph/ceph-client.admin.79875.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29918.log.gz
2026-03-24T12:09:13.224 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79875.log.gz
2026-03-24T12:09:13.225 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48214.log
2026-03-24T12:09:13.225 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65571.log
2026-03-24T12:09:13.225 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.48214.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48214.log.gz
2026-03-24T12:09:13.226 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64910.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64910.log.gz
2026-03-24T12:09:13.226 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47742.log
2026-03-24T12:09:13.226 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57773.log
2026-03-24T12:09:13.227 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65571.log: /var/log/ceph/ceph-client.admin.47742.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65571.log.gz
2026-03-24T12:09:13.227 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47742.log.gz
2026-03-24T12:09:13.227 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83829.log
2026-03-24T12:09:13.227 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28523.log
2026-03-24T12:09:13.228 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.57773.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57773.log.gz
2026-03-24T12:09:13.228 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83829.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83829.log.gz
2026-03-24T12:09:13.228 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78943.log
2026-03-24T12:09:13.228 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46824.log
2026-03-24T12:09:13.228 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28523.log: /var/log/ceph/ceph-client.admin.78943.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.28523.log.gz
2026-03-24T12:09:13.229 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78943.log.gz
2026-03-24T12:09:13.229 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44291.log
2026-03-24T12:09:13.229 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78672.log
2026-03-24T12:09:13.229 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46824.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46824.log.gz
2026-03-24T12:09:13.229 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44291.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44291.log.gz
2026-03-24T12:09:13.230 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85124.log
2026-03-24T12:09:13.230 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58945.log
2026-03-24T12:09:13.230 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78672.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78672.log.gz
2026-03-24T12:09:13.230 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.85124.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85124.log.gz
2026-03-24T12:09:13.230 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66087.log
2026-03-24T12:09:13.231 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68249.log
2026-03-24T12:09:13.231 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.58945.log: /var/log/ceph/ceph-client.admin.66087.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58945.log.gz
2026-03-24T12:09:13.231 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66087.log.gz
2026-03-24T12:09:13.231 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40520.log
2026-03-24T12:09:13.231 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88021.log
2026-03-24T12:09:13.232 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68249.log: /var/log/ceph/ceph-client.admin.40520.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68249.log.gz
2026-03-24T12:09:13.232 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40520.log.gz
2026-03-24T12:09:13.232 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56348.log
2026-03-24T12:09:13.232 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71481.log
2026-03-24T12:09:13.232 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.88021.log: /var/log/ceph/ceph-client.admin.56348.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88021.log.gz
2026-03-24T12:09:13.232 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56348.log.gz
2026-03-24T12:09:13.232 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38717.log
2026-03-24T12:09:13.233 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63614.log
2026-03-24T12:09:13.233 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.71481.log: /var/log/ceph/ceph-client.admin.38717.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71481.log.gz
2026-03-24T12:09:13.233 INFO:teuthology.orchestra.run.vm05.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.38717.log.gz
2026-03-24T12:09:13.233 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84909.log
2026-03-24T12:09:13.233 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44375.log
2026-03-24T12:09:13.234 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63614.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63614.log.gz
2026-03-24T12:09:13.234 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84909.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84909.log.gz
2026-03-24T12:09:13.234 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86964.log
2026-03-24T12:09:13.234 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44375.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62301.log
2026-03-24T12:09:13.235
INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.86964.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86964.log.gz 2026-03-24T12:09:13.235 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44375.log.gz 2026-03-24T12:09:13.235 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48728.log 2026-03-24T12:09:13.235 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40618.log 2026-03-24T12:09:13.236 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62301.log: /var/log/ceph/ceph-client.admin.48728.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62301.log.gz 2026-03-24T12:09:13.236 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48728.log.gz 2026-03-24T12:09:13.236 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52536.log 2026-03-24T12:09:13.236 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65366.log 2026-03-24T12:09:13.236 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40618.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40618.log.gz 2026-03-24T12:09:13.236 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.52536.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52536.log.gz 2026-03-24T12:09:13.237 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42101.log 2026-03-24T12:09:13.237 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26490.log 2026-03-24T12:09:13.237 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65366.log: /var/log/ceph/ceph-client.admin.42101.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65366.log.gz 
2026-03-24T12:09:13.237 INFO:teuthology.orchestra.run.vm05.stderr: 26.2% -- replaced with /var/log/ceph/ceph-client.admin.42101.log.gz 2026-03-24T12:09:13.237 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26211.log 2026-03-24T12:09:13.238 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87243.log 2026-03-24T12:09:13.238 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.26490.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26490.log.gz 2026-03-24T12:09:13.238 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.26211.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26211.log.gz 2026-03-24T12:09:13.238 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45632.log 2026-03-24T12:09:13.238 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66811.log 2026-03-24T12:09:13.238 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.87243.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87243.log.gz 2026-03-24T12:09:13.238 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45632.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45632.log.gz 2026-03-24T12:09:13.239 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72285.log 2026-03-24T12:09:13.239 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83915.log 2026-03-24T12:09:13.239 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66811.log: /var/log/ceph/ceph-client.admin.72285.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72285.log.gz 2026-03-24T12:09:13.239 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66811.log.gz 2026-03-24T12:09:13.239 
INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79375.log 2026-03-24T12:09:13.240 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44985.log 2026-03-24T12:09:13.240 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83915.log: /var/log/ceph/ceph-client.admin.79375.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83915.log.gz 2026-03-24T12:09:13.240 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79375.log.gz 2026-03-24T12:09:13.240 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44963.log 2026-03-24T12:09:13.240 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44985.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60463.log 2026-03-24T12:09:13.241 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44985.log.gz 2026-03-24T12:09:13.241 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44963.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44963.log.gz 2026-03-24T12:09:13.241 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67927.log 2026-03-24T12:09:13.241 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67497.log 2026-03-24T12:09:13.241 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.60463.log: /var/log/ceph/ceph-client.admin.67927.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60463.log.gz 2026-03-24T12:09:13.241 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67927.log.gz 2026-03-24T12:09:13.241 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46932.log 2026-03-24T12:09:13.242 
INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33772.log 2026-03-24T12:09:13.242 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67497.log: /var/log/ceph/ceph-client.admin.46932.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67497.log.gz 2026-03-24T12:09:13.242 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46932.log.gz 2026-03-24T12:09:13.242 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91334.log 2026-03-24T12:09:13.243 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40638.log 2026-03-24T12:09:13.243 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.91334.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91334.log.gz 2026-03-24T12:09:13.243 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.33772.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33772.log.gz 2026-03-24T12:09:13.243 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42325.log 2026-03-24T12:09:13.244 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78199.log 2026-03-24T12:09:13.244 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40638.log: /var/log/ceph/ceph-client.admin.42325.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40638.log.gz 2026-03-24T12:09:13.244 INFO:teuthology.orchestra.run.vm05.stderr: 56.0% -- replaced with /var/log/ceph/ceph-client.admin.42325.log.gz 2026-03-24T12:09:13.244 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37982.log 2026-03-24T12:09:13.244 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84780.log 2026-03-24T12:09:13.244 
INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78199.log: /var/log/ceph/ceph-client.admin.37982.log: 29.3% -- replaced with /var/log/ceph/ceph-client.admin.78199.log.gz 2026-03-24T12:09:13.245 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37982.log.gz 2026-03-24T12:09:13.245 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30827.log 2026-03-24T12:09:13.245 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79287.log 2026-03-24T12:09:13.245 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84780.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84780.log.gz 2026-03-24T12:09:13.245 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30827.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30827.log.gz 2026-03-24T12:09:13.245 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74958.log 2026-03-24T12:09:13.246 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54568.log 2026-03-24T12:09:13.246 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.79287.log: /var/log/ceph/ceph-client.admin.74958.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79287.log.gz 2026-03-24T12:09:13.246 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74958.log.gz 2026-03-24T12:09:13.246 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52772.log 2026-03-24T12:09:13.247 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45396.log 2026-03-24T12:09:13.247 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54568.log: /var/log/ceph/ceph-client.admin.52772.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.54568.log.gz 2026-03-24T12:09:13.247 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52772.log.gz 2026-03-24T12:09:13.247 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52815.log 2026-03-24T12:09:13.247 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36637.log 2026-03-24T12:09:13.247 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45396.log: /var/log/ceph/ceph-client.admin.52815.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45396.log.gz 2026-03-24T12:09:13.248 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52815.log.gz 2026-03-24T12:09:13.248 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59203.log 2026-03-24T12:09:13.248 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44271.log 2026-03-24T12:09:13.248 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36637.log: /var/log/ceph/ceph-client.admin.59203.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36637.log.gz 2026-03-24T12:09:13.248 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59203.log.gz 2026-03-24T12:09:13.249 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88696.log 2026-03-24T12:09:13.249 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61022.log 2026-03-24T12:09:13.249 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44271.log: /var/log/ceph/ceph-client.admin.88696.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88696.log.gz 2026-03-24T12:09:13.249 INFO:teuthology.orchestra.run.vm05.stderr: 25.7% -- replaced with 
/var/log/ceph/ceph-client.admin.44271.log.gz 2026-03-24T12:09:13.249 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64033.log 2026-03-24T12:09:13.250 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58542.log 2026-03-24T12:09:13.250 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61022.log: /var/log/ceph/ceph-client.admin.64033.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61022.log.gz 2026-03-24T12:09:13.250 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64033.log.gz 2026-03-24T12:09:13.250 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58004.log 2026-03-24T12:09:13.250 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69051.log 2026-03-24T12:09:13.251 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.58004.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58004.log.gz 2026-03-24T12:09:13.251 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.58542.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58542.log.gz 2026-03-24T12:09:13.251 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66644.log 2026-03-24T12:09:13.251 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.69051.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69051.log.gz 2026-03-24T12:09:13.251 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49350.log 2026-03-24T12:09:13.251 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84323.log 2026-03-24T12:09:13.252 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66644.log: /var/log/ceph/ceph-client.admin.49350.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.66644.log.gz 2026-03-24T12:09:13.252 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49350.log.gz 2026-03-24T12:09:13.252 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84737.log 2026-03-24T12:09:13.252 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71825.log 2026-03-24T12:09:13.252 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84323.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84323.log.gz 2026-03-24T12:09:13.252 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84737.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84737.log.gz 2026-03-24T12:09:13.252 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75225.log 2026-03-24T12:09:13.253 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46280.log 2026-03-24T12:09:13.253 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.71825.log: /var/log/ceph/ceph-client.admin.75225.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71825.log.gz 2026-03-24T12:09:13.253 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75225.log.gz 2026-03-24T12:09:13.253 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42733.log 2026-03-24T12:09:13.253 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34692.log 2026-03-24T12:09:13.254 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46280.log: /var/log/ceph/ceph-client.admin.42733.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46280.log.gz 2026-03-24T12:09:13.254 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.42733.log.gz 2026-03-24T12:09:13.254 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37534.log 2026-03-24T12:09:13.254 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66440.log 2026-03-24T12:09:13.254 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34692.log: /var/log/ceph/ceph-client.admin.37534.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34692.log.gz 2026-03-24T12:09:13.255 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74978.log 2026-03-24T12:09:13.255 INFO:teuthology.orchestra.run.vm05.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.37534.log.gz 2026-03-24T12:09:13.255 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66440.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66440.log.gz 2026-03-24T12:09:13.255 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80530.log 2026-03-24T12:09:13.255 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73306.log 2026-03-24T12:09:13.256 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74978.log: /var/log/ceph/ceph-client.admin.80530.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80530.log.gz 2026-03-24T12:09:13.256 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74978.log.gz 2026-03-24T12:09:13.256 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78908.log 2026-03-24T12:09:13.256 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34252.log 2026-03-24T12:09:13.257 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.73306.log: /var/log/ceph/ceph-client.admin.78908.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.78908.log.gz 2026-03-24T12:09:13.257 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73306.log.gz 2026-03-24T12:09:13.257 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37904.log 2026-03-24T12:09:13.257 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70006.log 2026-03-24T12:09:13.257 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34252.log: /var/log/ceph/ceph-client.admin.37904.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34252.log.gz 2026-03-24T12:09:13.257 INFO:teuthology.orchestra.run.vm05.stderr: 52.9% -- replaced with /var/log/ceph/ceph-client.admin.37904.log.gz 2026-03-24T12:09:13.257 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48561.log 2026-03-24T12:09:13.258 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61955.log 2026-03-24T12:09:13.258 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.70006.log: /var/log/ceph/ceph-client.admin.48561.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48561.log.gz 2026-03-24T12:09:13.258 INFO:teuthology.orchestra.run.vm05.stderr: 55.1% -- replaced with /var/log/ceph/ceph-client.admin.70006.log.gz 2026-03-24T12:09:13.258 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66769.log 2026-03-24T12:09:13.259 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68625.log 2026-03-24T12:09:13.259 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61955.log: /var/log/ceph/ceph-client.admin.66769.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61955.log.gz 2026-03-24T12:09:13.259 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.66769.log.gz 2026-03-24T12:09:13.259 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86663.log 2026-03-24T12:09:13.260 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32102.log 2026-03-24T12:09:13.260 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68625.log: /var/log/ceph/ceph-client.admin.86663.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68625.log.gz 2026-03-24T12:09:13.260 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86663.log.gz 2026-03-24T12:09:13.260 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64658.log 2026-03-24T12:09:13.260 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59705.log 2026-03-24T12:09:13.261 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32102.log: /var/log/ceph/ceph-client.admin.64658.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64658.log.gz 2026-03-24T12:09:13.261 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32102.log.gz 2026-03-24T12:09:13.261 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73542.log 2026-03-24T12:09:13.261 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62343.log 2026-03-24T12:09:13.261 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59705.log: /var/log/ceph/ceph-client.admin.73542.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59705.log.gz 2026-03-24T12:09:13.261 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73542.log.gz 2026-03-24T12:09:13.261 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.45286.log 2026-03-24T12:09:13.262 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52836.log 2026-03-24T12:09:13.262 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62343.log: /var/log/ceph/ceph-client.admin.45286.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62343.log.gz 2026-03-24T12:09:13.262 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45286.log.gz 2026-03-24T12:09:13.262 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66502.log 2026-03-24T12:09:13.262 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75890.log 2026-03-24T12:09:13.263 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.52836.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52836.log.gz 2026-03-24T12:09:13.263 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66502.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66502.log.gz 2026-03-24T12:09:13.263 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87157.log 2026-03-24T12:09:13.263 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50606.log 2026-03-24T12:09:13.263 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75890.log: /var/log/ceph/ceph-client.admin.87157.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75890.log.gz 2026-03-24T12:09:13.263 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87157.log.gz 2026-03-24T12:09:13.263 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35838.log 2026-03-24T12:09:13.264 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.43793.log 2026-03-24T12:09:13.264 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50606.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50606.log.gz 2026-03-24T12:09:13.264 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35838.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35838.log.gz 2026-03-24T12:09:13.264 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31578.log 2026-03-24T12:09:13.264 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49588.log 2026-03-24T12:09:13.265 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43793.log: /var/log/ceph/ceph-client.admin.31578.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43793.log.gz 2026-03-24T12:09:13.265 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31578.log.gz 2026-03-24T12:09:13.265 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57205.log 2026-03-24T12:09:13.265 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68931.log 2026-03-24T12:09:13.265 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.49588.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49588.log.gz 2026-03-24T12:09:13.265 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.57205.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57205.log.gz 2026-03-24T12:09:13.265 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71159.log 2026-03-24T12:09:13.266 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86427.log 2026-03-24T12:09:13.266 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68931.log: 
/var/log/ceph/ceph-client.admin.71159.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68931.log.gz 2026-03-24T12:09:13.266 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71159.log.gz 2026-03-24T12:09:13.266 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35923.log 2026-03-24T12:09:13.266 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90467.log 2026-03-24T12:09:13.267 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.86427.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86427.log.gz 2026-03-24T12:09:13.267 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35923.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35923.log.gz 2026-03-24T12:09:13.267 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71545.log 2026-03-24T12:09:13.267 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32444.log 2026-03-24T12:09:13.267 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90467.log: /var/log/ceph/ceph-client.admin.71545.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90467.log.gz 2026-03-24T12:09:13.267 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71545.log.gz 2026-03-24T12:09:13.268 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43871.log 2026-03-24T12:09:13.268 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78638.log 2026-03-24T12:09:13.268 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32444.log: /var/log/ceph/ceph-client.admin.43871.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43871.log.gz 2026-03-24T12:09:13.268 
INFO:teuthology.orchestra.run.vm05.stderr: 2.5% -- replaced with /var/log/ceph/ceph-client.admin.32444.log.gz 2026-03-24T12:09:13.268 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35461.log 2026-03-24T12:09:13.269 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56904.log 2026-03-24T12:09:13.269 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78638.log: /var/log/ceph/ceph-client.admin.35461.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35461.log.gz 2026-03-24T12:09:13.269 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78638.log.gz 2026-03-24T12:09:13.269 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79450.log 2026-03-24T12:09:13.269 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61918.log 2026-03-24T12:09:13.269 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.56904.log: /var/log/ceph/ceph-client.admin.79450.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56904.log.gz 2026-03-24T12:09:13.270 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79450.log.gz 2026-03-24T12:09:13.270 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59064.log 2026-03-24T12:09:13.270 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34892.log 2026-03-24T12:09:13.270 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61918.log: /var/log/ceph/ceph-client.admin.59064.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61918.log.gz 2026-03-24T12:09:13.270 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59064.log.gz 2026-03-24T12:09:13.270 
INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89098.log 2026-03-24T12:09:13.271 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48771.log 2026-03-24T12:09:13.271 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34892.log: /var/log/ceph/ceph-client.admin.89098.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34892.log.gz 2026-03-24T12:09:13.271 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89098.log.gz 2026-03-24T12:09:13.271 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76774.log 2026-03-24T12:09:13.271 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46666.log 2026-03-24T12:09:13.271 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.48771.log: /var/log/ceph/ceph-client.admin.76774.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48771.log.gz 2026-03-24T12:09:13.271 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76774.log.gz 2026-03-24T12:09:13.272 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53925.log 2026-03-24T12:09:13.272 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76903.log 2026-03-24T12:09:13.272 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46666.log: /var/log/ceph/ceph-client.admin.53925.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46666.log.gz 2026-03-24T12:09:13.272 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53925.log.gz 2026-03-24T12:09:13.272 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68498.log 2026-03-24T12:09:13.272 
INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26727.log 2026-03-24T12:09:13.273 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.76903.log: /var/log/ceph/ceph-client.admin.68498.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76903.log.gz 2026-03-24T12:09:13.273 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68498.log.gz 2026-03-24T12:09:13.273 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52901.log 2026-03-24T12:09:13.273 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80444.log 2026-03-24T12:09:13.273 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.26727.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26727.log.gz 2026-03-24T12:09:13.273 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.52901.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52901.log.gz 2026-03-24T12:09:13.273 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62112.log 2026-03-24T12:09:13.274 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28481.log 2026-03-24T12:09:13.274 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80444.log: /var/log/ceph/ceph-client.admin.62112.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80444.log.gz 2026-03-24T12:09:13.274 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62112.log.gz 2026-03-24T12:09:13.274 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77268.log 2026-03-24T12:09:13.274 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79077.log 2026-03-24T12:09:13.275 
INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77268.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77268.log.gz 2026-03-24T12:09:13.275 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28481.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28481.log.gz 2026-03-24T12:09:13.275 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61495.log 2026-03-24T12:09:13.275 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68185.log 2026-03-24T12:09:13.275 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.79077.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79077.log.gz 2026-03-24T12:09:13.275 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61495.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61495.log.gz 2026-03-24T12:09:13.276 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78049.log 2026-03-24T12:09:13.276 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73349.log 2026-03-24T12:09:13.276 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68185.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68185.log.gz 2026-03-24T12:09:13.276 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78049.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78049.log.gz 2026-03-24T12:09:13.276 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89951.log 2026-03-24T12:09:13.276 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34272.log 2026-03-24T12:09:13.277 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.73349.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73349.log.gz 
2026-03-24T12:09:13.277 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.89951.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89951.log.gz 2026-03-24T12:09:13.277 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64364.log 2026-03-24T12:09:13.277 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56496.log 2026-03-24T12:09:13.277 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34272.log: /var/log/ceph/ceph-client.admin.64364.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34272.log.gz 2026-03-24T12:09:13.277 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64364.log.gz 2026-03-24T12:09:13.278 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87526.log 2026-03-24T12:09:13.278 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61516.log 2026-03-24T12:09:13.278 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.56496.log: /var/log/ceph/ceph-client.admin.87526.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56496.log.gz 2026-03-24T12:09:13.278 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87526.log.gz 2026-03-24T12:09:13.278 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76535.log 2026-03-24T12:09:13.279 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61516.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63534.log 2026-03-24T12:09:13.279 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61516.log.gz 2026-03-24T12:09:13.279 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.76535.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.76535.log.gz 2026-03-24T12:09:13.279 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86599.log 2026-03-24T12:09:13.279 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73026.log 2026-03-24T12:09:13.279 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63534.log: /var/log/ceph/ceph-client.admin.86599.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63534.log.gz 2026-03-24T12:09:13.279 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86599.log.gz 2026-03-24T12:09:13.279 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38591.log 2026-03-24T12:09:13.280 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91419.log 2026-03-24T12:09:13.280 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.73026.log: /var/log/ceph/ceph-client.admin.38591.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73026.log.gz 2026-03-24T12:09:13.280 INFO:teuthology.orchestra.run.vm05.stderr: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.38591.log.gz 2026-03-24T12:09:13.280 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80638.log 2026-03-24T12:09:13.280 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40700.log 2026-03-24T12:09:13.281 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.91419.log: /var/log/ceph/ceph-client.admin.80638.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91419.log.gz 2026-03-24T12:09:13.281 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80638.log.gz 2026-03-24T12:09:13.281 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.46490.log 2026-03-24T12:09:13.281 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65444.log 2026-03-24T12:09:13.281 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40700.log: /var/log/ceph/ceph-client.admin.46490.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40700.log.gz 2026-03-24T12:09:13.281 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46490.log.gz 2026-03-24T12:09:13.281 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34952.log 2026-03-24T12:09:13.282 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90660.log 2026-03-24T12:09:13.282 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65444.log: /var/log/ceph/ceph-client.admin.34952.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65444.log.gz 2026-03-24T12:09:13.282 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34952.log.gz 2026-03-24T12:09:13.282 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26039.log 2026-03-24T12:09:13.282 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87304.log 2026-03-24T12:09:13.283 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90660.log: /var/log/ceph/ceph-client.admin.26039.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90660.log.gz 2026-03-24T12:09:13.283 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26039.log.gz 2026-03-24T12:09:13.283 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80122.log 2026-03-24T12:09:13.283 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.78690.log 2026-03-24T12:09:13.283 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.87304.log: /var/log/ceph/ceph-client.admin.80122.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80122.log.gz 2026-03-24T12:09:13.283 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87304.log.gz 2026-03-24T12:09:13.283 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75825.log 2026-03-24T12:09:13.284 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85994.log 2026-03-24T12:09:13.284 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78690.log: /var/log/ceph/ceph-client.admin.75825.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78690.log.gz 2026-03-24T12:09:13.284 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75825.log.gz 2026-03-24T12:09:13.284 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76277.log 2026-03-24T12:09:13.284 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79054.log 2026-03-24T12:09:13.285 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.85994.log: /var/log/ceph/ceph-client.admin.76277.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85994.log.gz 2026-03-24T12:09:13.285 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76277.log.gz 2026-03-24T12:09:13.285 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76946.log 2026-03-24T12:09:13.285 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38486.log 2026-03-24T12:09:13.285 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.79054.log: 
/var/log/ceph/ceph-client.admin.76946.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79054.log.gz 2026-03-24T12:09:13.285 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76946.log.gz 2026-03-24T12:09:13.285 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37198.log 2026-03-24T12:09:13.286 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30738.log 2026-03-24T12:09:13.286 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.38486.log: /var/log/ceph/ceph-client.admin.37198.log: 27.1% -- replaced with /var/log/ceph/ceph-client.admin.38486.log.gz 2026-03-24T12:09:13.286 INFO:teuthology.orchestra.run.vm05.stderr: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.37198.log.gz 2026-03-24T12:09:13.286 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72006.log 2026-03-24T12:09:13.287 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36807.log 2026-03-24T12:09:13.287 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72006.log: /var/log/ceph/ceph-client.admin.30738.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72006.log.gz 2026-03-24T12:09:13.287 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30738.log.gz 2026-03-24T12:09:13.287 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75546.log 2026-03-24T12:09:13.287 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48150.log 2026-03-24T12:09:13.287 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36807.log: /var/log/ceph/ceph-client.admin.75546.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36807.log.gz 2026-03-24T12:09:13.287 
INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75546.log.gz 2026-03-24T12:09:13.288 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87093.log 2026-03-24T12:09:13.288 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80272.log 2026-03-24T12:09:13.288 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.48150.log: /var/log/ceph/ceph-client.admin.87093.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48150.log.gz 2026-03-24T12:09:13.288 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87093.log.gz 2026-03-24T12:09:13.288 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37045.log 2026-03-24T12:09:13.289 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47861.log 2026-03-24T12:09:13.289 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80272.log: /var/log/ceph/ceph-client.admin.37045.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80272.log.gz 2026-03-24T12:09:13.289 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37045.log.gz 2026-03-24T12:09:13.289 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91980.log 2026-03-24T12:09:13.289 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84436.log 2026-03-24T12:09:13.289 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.47861.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47861.log.gz 2026-03-24T12:09:13.289 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.91980.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91980.log.gz 2026-03-24T12:09:13.290 
INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69427.log 2026-03-24T12:09:13.290 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43552.log 2026-03-24T12:09:13.290 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84436.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84436.log.gz 2026-03-24T12:09:13.290 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.69427.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69427.log.gz 2026-03-24T12:09:13.290 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63169.log 2026-03-24T12:09:13.290 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41555.log 2026-03-24T12:09:13.291 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43552.log: /var/log/ceph/ceph-client.admin.63169.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63169.log.gz 2026-03-24T12:09:13.291 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52643.log 2026-03-24T12:09:13.291 INFO:teuthology.orchestra.run.vm05.stderr: 55.5% -- replaced with /var/log/ceph/ceph-client.admin.43552.log.gz 2026-03-24T12:09:13.291 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90359.log 2026-03-24T12:09:13.291 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.52643.log: /var/log/ceph/ceph-client.admin.41555.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52643.log.gz 2026-03-24T12:09:13.291 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41555.log.gz 2026-03-24T12:09:13.291 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63898.log 2026-03-24T12:09:13.292 
INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34332.log 2026-03-24T12:09:13.292 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90359.log: /var/log/ceph/ceph-client.admin.63898.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90359.log.gz 2026-03-24T12:09:13.292 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63898.log.gz 2026-03-24T12:09:13.292 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69233.log 2026-03-24T12:09:13.292 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40320.log 2026-03-24T12:09:13.293 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34332.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34332.log.gz 2026-03-24T12:09:13.293 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.69233.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69233.log.gz 2026-03-24T12:09:13.293 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33872.log 2026-03-24T12:09:13.293 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48442.log 2026-03-24T12:09:13.293 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40320.log: /var/log/ceph/ceph-client.admin.33872.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40320.log.gz 2026-03-24T12:09:13.293 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33872.log.gz 2026-03-24T12:09:13.293 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91146.log 2026-03-24T12:09:13.294 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67024.log 2026-03-24T12:09:13.294 
INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.48442.log: /var/log/ceph/ceph-client.admin.91146.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91146.log.gz 2026-03-24T12:09:13.294 INFO:teuthology.orchestra.run.vm05.stderr: 55.1% -- replaced with /var/log/ceph/ceph-client.admin.48442.log.gz 2026-03-24T12:09:13.294 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38738.log 2026-03-24T12:09:13.294 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44942.log 2026-03-24T12:09:13.294 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67024.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67024.log.gz 2026-03-24T12:09:13.295 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.38738.log: 25.8% -- replaced with /var/log/ceph/ceph-client.admin.38738.log.gz 2026-03-24T12:09:13.295 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49006.log 2026-03-24T12:09:13.295 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37113.log 2026-03-24T12:09:13.295 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44942.log: /var/log/ceph/ceph-client.admin.49006.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44942.log.gz 2026-03-24T12:09:13.295 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49006.log.gz 2026-03-24T12:09:13.295 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81371.log 2026-03-24T12:09:13.296 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45006.log 2026-03-24T12:09:13.296 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.37113.log: /var/log/ceph/ceph-client.admin.81371.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.37113.log.gz 2026-03-24T12:09:13.296 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81371.log.gz 2026-03-24T12:09:13.296 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50436.log 2026-03-24T12:09:13.296 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57983.log 2026-03-24T12:09:13.296 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45006.log: /var/log/ceph/ceph-client.admin.50436.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45006.log.gz 2026-03-24T12:09:13.296 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50436.log.gz 2026-03-24T12:09:13.297 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59601.log 2026-03-24T12:09:13.297 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86059.log 2026-03-24T12:09:13.297 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.57983.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57983.log.gz 2026-03-24T12:09:13.297 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59601.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59601.log.gz 2026-03-24T12:09:13.297 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81822.log 2026-03-24T12:09:13.297 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73392.log 2026-03-24T12:09:13.298 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.86059.log: /var/log/ceph/ceph-client.admin.81822.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86059.log.gz 2026-03-24T12:09:13.298 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.81822.log.gz 2026-03-24T12:09:13.298 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50312.log 2026-03-24T12:09:13.298 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31085.log 2026-03-24T12:09:13.298 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.73392.log: /var/log/ceph/ceph-client.admin.50312.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73392.log.gz 2026-03-24T12:09:13.298 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50312.log.gz 2026-03-24T12:09:13.299 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39232.log 2026-03-24T12:09:13.299 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27013.log 2026-03-24T12:09:13.299 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.31085.log: /var/log/ceph/ceph-client.admin.39232.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31085.log.gz 2026-03-24T12:09:13.299 INFO:teuthology.orchestra.run.vm05.stderr: 26.2% -- replaced with /var/log/ceph/ceph-client.admin.39232.log.gz 2026-03-24T12:09:13.299 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43391.log 2026-03-24T12:09:13.300 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36028.log 2026-03-24T12:09:13.300 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27013.log: /var/log/ceph/ceph-client.admin.43391.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27013.log.gz 2026-03-24T12:09:13.300 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43391.log.gz 2026-03-24T12:09:13.300 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.65212.log 2026-03-24T12:09:13.300 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.22866.log 2026-03-24T12:09:13.300 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36028.log: /var/log/ceph/ceph-client.admin.65212.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36028.log.gz 2026-03-24T12:09:13.300 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65212.log.gz 2026-03-24T12:09:13.301 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74231.log 2026-03-24T12:09:13.301 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50206.log 2026-03-24T12:09:13.301 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.22866.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.22866.log.gz 2026-03-24T12:09:13.301 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74231.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74231.log.gz 2026-03-24T12:09:13.301 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90273.log 2026-03-24T12:09:13.301 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68852.log 2026-03-24T12:09:13.302 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50206.log: /var/log/ceph/ceph-client.admin.90273.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90273.log.gz 2026-03-24T12:09:13.302 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50206.log.gz 2026-03-24T12:09:13.302 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78655.log 2026-03-24T12:09:13.302 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.63304.log
2026-03-24T12:09:13.302 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68852.log: /var/log/ceph/ceph-client.admin.78655.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78655.log.gz
2026-03-24T12:09:13.303 INFO:teuthology.orchestra.run.vm05.stderr: 25.8% -- replaced with /var/log/ceph/ceph-client.admin.68852.log.gz
2026-03-24T12:09:13.303 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63205.log
2026-03-24T12:09:13.303 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44791.log
2026-03-24T12:09:13.303 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63304.log: /var/log/ceph/ceph-client.admin.63205.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63304.log.gz
2026-03-24T12:09:13.303 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63205.log.gz
2026-03-24T12:09:13.303 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36317.log
2026-03-24T12:09:13.304 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58477.log
2026-03-24T12:09:13.304 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44791.log: /var/log/ceph/ceph-client.admin.36317.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44791.log.gz
2026-03-24T12:09:13.304 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36317.log.gz
2026-03-24T12:09:13.304 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47799.log
2026-03-24T12:09:13.304 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35583.log
2026-03-24T12:09:13.304 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.58477.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58477.log.gz
2026-03-24T12:09:13.305 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.47799.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47799.log.gz
2026-03-24T12:09:13.305 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31949.log
2026-03-24T12:09:13.305 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40997.log
2026-03-24T12:09:13.305 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35583.log: /var/log/ceph/ceph-client.admin.31949.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35583.log.gz
2026-03-24T12:09:13.305 INFO:teuthology.orchestra.run.vm05.stderr: 2.5% -- replaced with /var/log/ceph/ceph-client.admin.31949.log.gz
2026-03-24T12:09:13.305 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89757.log
2026-03-24T12:09:13.305 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49092.log
2026-03-24T12:09:13.306 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40997.log: /var/log/ceph/ceph-client.admin.89757.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40997.log.gz
2026-03-24T12:09:13.306 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89757.log.gz
2026-03-24T12:09:13.306 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31300.log
2026-03-24T12:09:13.306 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61173.log
2026-03-24T12:09:13.307 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.49092.log: /var/log/ceph/ceph-client.admin.31300.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31300.log.gz
2026-03-24T12:09:13.307 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49092.log.gz
2026-03-24T12:09:13.307 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58628.log
2026-03-24T12:09:13.307 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46421.log
2026-03-24T12:09:13.308 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61173.log: /var/log/ceph/ceph-client.admin.58628.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58628.log.gz
2026-03-24T12:09:13.308 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61173.log.gz
2026-03-24T12:09:13.308 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85672.log
2026-03-24T12:09:13.308 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31796.log
2026-03-24T12:09:13.308 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46421.log: /var/log/ceph/ceph-client.admin.85672.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46421.log.gz -- replaced with /var/log/ceph/ceph-client.admin.85672.log.gz
2026-03-24T12:09:13.308 INFO:teuthology.orchestra.run.vm05.stderr:
2026-03-24T12:09:13.309 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37177.log
2026-03-24T12:09:13.309 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89886.log
2026-03-24T12:09:13.309 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.31796.log: /var/log/ceph/ceph-client.admin.37177.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.37177.log.gz
2026-03-24T12:09:13.309 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.31796.log.gz
2026-03-24T12:09:13.309 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67432.log
2026-03-24T12:09:13.309 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83371.log
2026-03-24T12:09:13.310 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.89886.log: /var/log/ceph/ceph-client.admin.67432.log: 27.1% -- replaced with /var/log/ceph/ceph-client.admin.89886.log.gz
2026-03-24T12:09:13.310 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67432.log.gz
2026-03-24T12:09:13.310 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84044.log
2026-03-24T12:09:13.310 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41615.log
2026-03-24T12:09:13.310 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83371.log: /var/log/ceph/ceph-client.admin.84044.log: 84.4% -- replaced with /var/log/ceph/ceph-client.admin.83371.log.gz
2026-03-24T12:09:13.310 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84044.log.gz
2026-03-24T12:09:13.311 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62422.log
2026-03-24T12:09:13.311 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62828.log
2026-03-24T12:09:13.311 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41615.log: /var/log/ceph/ceph-client.admin.62422.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62422.log.gz
2026-03-24T12:09:13.311 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41615.log.gz
2026-03-24T12:09:13.311 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26819.log
2026-03-24T12:09:13.312 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76795.log
2026-03-24T12:09:13.312 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62828.log: /var/log/ceph/ceph-client.admin.26819.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62828.log.gz
2026-03-24T12:09:13.312 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26819.log.gz
2026-03-24T12:09:13.312 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53700.log
2026-03-24T12:09:13.312 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36994.log
2026-03-24T12:09:13.313 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.76795.log: /var/log/ceph/ceph-client.admin.53700.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76795.log.gz
2026-03-24T12:09:13.313 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53700.log.gz
2026-03-24T12:09:13.313 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43230.log
2026-03-24T12:09:13.313 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75976.log
2026-03-24T12:09:13.313 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36994.log: /var/log/ceph/ceph-client.admin.43230.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36994.log.gz
2026-03-24T12:09:13.313 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43230.log.gz
2026-03-24T12:09:13.314 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76298.log
2026-03-24T12:09:13.314 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25528.log
2026-03-24T12:09:13.314 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75976.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75976.log.gz
2026-03-24T12:09:13.314 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.76298.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76298.log.gz
2026-03-24T12:09:13.314 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74795.log
2026-03-24T12:09:13.315 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74795.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74795.log.gz
2026-03-24T12:09:13.315 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48173.log
2026-03-24T12:09:13.315 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.25528.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44641.log
2026-03-24T12:09:13.315 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25528.log.gz
2026-03-24T12:09:13.315 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.48173.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48173.log.gz
2026-03-24T12:09:13.315 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33075.log
2026-03-24T12:09:13.316 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76577.log
2026-03-24T12:09:13.316 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44641.log: /var/log/ceph/ceph-client.admin.33075.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44641.log.gz
2026-03-24T12:09:13.316 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.33075.log.gz
2026-03-24T12:09:13.316 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54589.log
2026-03-24T12:09:13.316 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.22702.log
2026-03-24T12:09:13.316 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.76577.log: /var/log/ceph/ceph-client.admin.54589.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76577.log.gz
2026-03-24T12:09:13.317 INFO:teuthology.orchestra.run.vm05.stderr: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.54589.log.gz
2026-03-24T12:09:13.317 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75786.log
2026-03-24T12:09:13.317 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32153.log
2026-03-24T12:09:13.317 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.22702.log: /var/log/ceph/ceph-client.admin.75786.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.22702.log.gz
2026-03-24T12:09:13.317 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75786.log.gz
2026-03-24T12:09:13.317 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41083.log
2026-03-24T12:09:13.318 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44920.log
2026-03-24T12:09:13.318 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32153.log: /var/log/ceph/ceph-client.admin.41083.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41083.log.gz
2026-03-24T12:09:13.318 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32153.log.gz
2026-03-24T12:09:13.318 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46867.log
2026-03-24T12:09:13.318 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91570.log
2026-03-24T12:09:13.319 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44920.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44920.log.gz
2026-03-24T12:09:13.319 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42526.log
2026-03-24T12:09:13.319 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46867.log: /var/log/ceph/ceph-client.admin.91570.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46867.log.gz
2026-03-24T12:09:13.319 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91570.log.gz
2026-03-24T12:09:13.319 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58966.log
2026-03-24T12:09:13.320 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76838.log
2026-03-24T12:09:13.320 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.42526.log: /var/log/ceph/ceph-client.admin.58966.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42526.log.gz
2026-03-24T12:09:13.320 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58966.log.gz
2026-03-24T12:09:13.320 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57876.log
2026-03-24T12:09:13.320 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49049.log
2026-03-24T12:09:13.320 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.76838.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76838.log.gz
2026-03-24T12:09:13.321 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.57876.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83372.log
2026-03-24T12:09:13.321 INFO:teuthology.orchestra.run.vm05.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.57876.log.gz
2026-03-24T12:09:13.321 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.49049.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49049.log.gz
2026-03-24T12:09:13.321 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34032.log
2026-03-24T12:09:13.321 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59418.log
2026-03-24T12:09:13.322 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83372.log: /var/log/ceph/ceph-client.admin.34032.log: 83.0% -- replaced with /var/log/ceph/ceph-client.admin.83372.log.gz
2026-03-24T12:09:13.322 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34032.log.gz
2026-03-24T12:09:13.322 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53051.log
2026-03-24T12:09:13.322 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33057.log
2026-03-24T12:09:13.323 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59418.log: /var/log/ceph/ceph-client.admin.53051.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53051.log.gz
2026-03-24T12:09:13.323 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59418.log.gz
2026-03-24T12:09:13.323 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61280.log
2026-03-24T12:09:13.323 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58714.log
2026-03-24T12:09:13.323 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.33057.log: /var/log/ceph/ceph-client.admin.61280.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61280.log.gz
2026-03-24T12:09:13.323 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.33057.log.gz
2026-03-24T12:09:13.324 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63695.log
2026-03-24T12:09:13.324 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55932.log
2026-03-24T12:09:13.324 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.58714.log: /var/log/ceph/ceph-client.admin.63695.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58714.log.gz
2026-03-24T12:09:13.324 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63695.log.gz
2026-03-24T12:09:13.324 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54718.log
2026-03-24T12:09:13.325 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26297.log
2026-03-24T12:09:13.325 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55932.log: /var/log/ceph/ceph-client.admin.54718.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55932.log.gz
2026-03-24T12:09:13.325 INFO:teuthology.orchestra.run.vm05.stderr: 54.0% -- replaced with /var/log/ceph/ceph-client.admin.54718.log.gz
2026-03-24T12:09:13.325 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74431.log
2026-03-24T12:09:13.325 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78855.log
2026-03-24T12:09:13.325 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.26297.log: /var/log/ceph/ceph-client.admin.74431.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26297.log.gz
2026-03-24T12:09:13.325 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74431.log.gz
2026-03-24T12:09:13.326 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56223.log
2026-03-24T12:09:13.326 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40846.log
2026-03-24T12:09:13.326 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78855.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78855.log.gz
2026-03-24T12:09:13.326 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.56223.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56223.log.gz
2026-03-24T12:09:13.326 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72206.log
2026-03-24T12:09:13.327 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27603.log
2026-03-24T12:09:13.327 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40846.log: /var/log/ceph/ceph-client.admin.72206.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40846.log.gz
2026-03-24T12:09:13.327 INFO:teuthology.orchestra.run.vm05.stderr: 54.8% -- replaced with /var/log/ceph/ceph-client.admin.72206.log.gz
2026-03-24T12:09:13.327 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80045.log
2026-03-24T12:09:13.327 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27603.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27603.log.gz
2026-03-24T12:09:13.328 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65285.log
2026-03-24T12:09:13.328 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62551.log
2026-03-24T12:09:13.328 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80045.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80045.log.gz
2026-03-24T12:09:13.328 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65285.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65285.log.gz
2026-03-24T12:09:13.328 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63067.log
2026-03-24T12:09:13.328 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88736.log
2026-03-24T12:09:13.329 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62551.log: /var/log/ceph/ceph-client.admin.63067.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62551.log.gz
2026-03-24T12:09:13.329 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63067.log.gz
2026-03-24T12:09:13.329 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65231.log
2026-03-24T12:09:13.329 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68706.log
2026-03-24T12:09:13.329 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.88736.log: /var/log/ceph/ceph-client.admin.65231.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88736.log.gz
2026-03-24T12:09:13.329 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65231.log.gz
2026-03-24T12:09:13.330 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84065.log
2026-03-24T12:09:13.330 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84586.log
2026-03-24T12:09:13.330 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68706.log: /var/log/ceph/ceph-client.admin.84065.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68706.log.gz
2026-03-24T12:09:13.330 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84065.log.gz
2026-03-24T12:09:13.330 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87381.log
2026-03-24T12:09:13.331 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84586.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84586.log.gz
2026-03-24T12:09:13.331 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55846.log
2026-03-24T12:09:13.331 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.87381.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87381.log.gz
2026-03-24T12:09:13.331 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47061.log
2026-03-24T12:09:13.331 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62500.log
2026-03-24T12:09:13.332 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55846.log: /var/log/ceph/ceph-client.admin.47061.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55846.log.gz
2026-03-24T12:09:13.332 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47061.log.gz
2026-03-24T12:09:13.332 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55395.log
2026-03-24T12:09:13.332 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41427.log
2026-03-24T12:09:13.332 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62500.log: /var/log/ceph/ceph-client.admin.55395.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62500.log.gz
2026-03-24T12:09:13.332 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55395.log.gz
2026-03-24T12:09:13.332 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48878.log
2026-03-24T12:09:13.333 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30442.log
2026-03-24T12:09:13.333 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41427.log: /var/log/ceph/ceph-client.admin.48878.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41427.log.gz
2026-03-24T12:09:13.333 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48878.log.gz
2026-03-24T12:09:13.333 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37660.log
2026-03-24T12:09:13.333 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79432.log
2026-03-24T12:09:13.333 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30442.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30442.log.gz
2026-03-24T12:09:13.334 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.37660.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86470.log
2026-03-24T12:09:13.334 INFO:teuthology.orchestra.run.vm05.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.37660.log.gz
2026-03-24T12:09:13.334 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.79432.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79432.log.gz
2026-03-24T12:09:13.334 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56077.log
2026-03-24T12:09:13.335 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.86470.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86470.log.gz
2026-03-24T12:09:13.335 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63879.log
2026-03-24T12:09:13.335 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68768.log
2026-03-24T12:09:13.335 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.56077.log: /var/log/ceph/ceph-client.admin.63879.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56077.log.gz
2026-03-24T12:09:13.335 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63879.log.gz
2026-03-24T12:09:13.336 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44662.log
2026-03-24T12:09:13.336 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72962.log
2026-03-24T12:09:13.336 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68768.log: /var/log/ceph/ceph-client.admin.44662.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44662.log.gz
2026-03-24T12:09:13.336 INFO:teuthology.orchestra.run.vm05.stderr: 25.2% -- replaced with /var/log/ceph/ceph-client.admin.68768.log.gz
2026-03-24T12:09:13.336 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66482.log
2026-03-24T12:09:13.336 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86405.log
2026-03-24T12:09:13.337 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72962.log: /var/log/ceph/ceph-client.admin.66482.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72962.log.gz
2026-03-24T12:09:13.337 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66482.log.gz
2026-03-24T12:09:13.337 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80197.log
2026-03-24T12:09:13.337 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44619.log
2026-03-24T12:09:13.337 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.86405.log: /var/log/ceph/ceph-client.admin.80197.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86405.log.gz
2026-03-24T12:09:13.337 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80197.log.gz
2026-03-24T12:09:13.338 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36011.log
2026-03-24T12:09:13.338 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79214.log
2026-03-24T12:09:13.338 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44619.log: /var/log/ceph/ceph-client.admin.36011.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44619.log.gz
2026-03-24T12:09:13.338 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36011.log.gz
2026-03-24T12:09:13.338 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26641.log
2026-03-24T12:09:13.339 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54134.log
2026-03-24T12:09:13.339 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.79214.log: /var/log/ceph/ceph-client.admin.26641.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79214.log.gz
2026-03-24T12:09:13.339 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26641.log.gz
2026-03-24T12:09:13.339 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25458.log
2026-03-24T12:09:13.339 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67239.log
2026-03-24T12:09:13.340 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54134.log: /var/log/ceph/ceph-client.admin.25458.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54134.log.gz
2026-03-24T12:09:13.340 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25458.log.gz
2026-03-24T12:09:13.340 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33592.log
2026-03-24T12:09:13.340 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29102.log
2026-03-24T12:09:13.340 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67239.log: /var/log/ceph/ceph-client.admin.33592.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67239.log.gz
2026-03-24T12:09:13.340 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33592.log.gz
2026-03-24T12:09:13.341 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44112.log
2026-03-24T12:09:13.341 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38980.log
2026-03-24T12:09:13.341 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.29102.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29102.log.gz
2026-03-24T12:09:13.341 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44112.log: 26.1% -- replaced with /var/log/ceph/ceph-client.admin.44112.log.gz
2026-03-24T12:09:13.341 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53864.log
2026-03-24T12:09:13.341 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56539.log
2026-03-24T12:09:13.342 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.38980.log: /var/log/ceph/ceph-client.admin.53864.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53864.log.gz
2026-03-24T12:09:13.342 INFO:teuthology.orchestra.run.vm05.stderr: 25.5% -- replaced with /var/log/ceph/ceph-client.admin.38980.log.gz
2026-03-24T12:09:13.342 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31881.log
2026-03-24T12:09:13.342 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82585.log
2026-03-24T12:09:13.342 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.56539.log: /var/log/ceph/ceph-client.admin.31881.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56539.log.gz
2026-03-24T12:09:13.343 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.31881.log.gz
2026-03-24T12:09:13.343 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58176.log
2026-03-24T12:09:13.343 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63322.log
2026-03-24T12:09:13.343 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82585.log: /var/log/ceph/ceph-client.admin.58176.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82585.log.gz
2026-03-24T12:09:13.343 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58176.log.gz
2026-03-24T12:09:13.343 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69771.log
2026-03-24T12:09:13.343 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78135.log
2026-03-24T12:09:13.344 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63322.log: /var/log/ceph/ceph-client.admin.69771.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63322.log.gz
2026-03-24T12:09:13.344 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69771.log.gz
2026-03-24T12:09:13.344 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82201.log
2026-03-24T12:09:13.344 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42284.log
2026-03-24T12:09:13.344 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78135.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78135.log.gz
2026-03-24T12:09:13.344 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82201.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82201.log.gz
2026-03-24T12:09:13.345 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74113.log
2026-03-24T12:09:13.345 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81457.log
2026-03-24T12:09:13.345 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.42284.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42284.log.gz
2026-03-24T12:09:13.345 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74113.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74113.log.gz
2026-03-24T12:09:13.345 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36402.log
2026-03-24T12:09:13.345 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40146.log
2026-03-24T12:09:13.346 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.81457.log: /var/log/ceph/ceph-client.admin.36402.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81457.log.gz
2026-03-24T12:09:13.346 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36402.log.gz
2026-03-24T12:09:13.346 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79169.log
2026-03-24T12:09:13.346 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79123.log
2026-03-24T12:09:13.346 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40146.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40146.log.gz
2026-03-24T12:09:13.347 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.79169.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79169.log.gz
2026-03-24T12:09:13.347 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77608.log
2026-03-24T12:09:13.347 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52686.log
2026-03-24T12:09:13.347 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.79123.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79123.log.gz
2026-03-24T12:09:13.347 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77608.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77608.log.gz
2026-03-24T12:09:13.347 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35994.log
2026-03-24T12:09:13.348 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66023.log
2026-03-24T12:09:13.348 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.52686.log: /var/log/ceph/ceph-client.admin.35994.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52686.log.gz
2026-03-24T12:09:13.348 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35994.log.gz
2026-03-24T12:09:13.348 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81049.log
2026-03-24T12:09:13.348 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28008.log
2026-03-24T12:09:13.349 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66023.log: /var/log/ceph/ceph-client.admin.81049.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66023.log.gz
2026-03-24T12:09:13.349 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81049.log.gz
2026-03-24T12:09:13.349 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59623.log
2026-03-24T12:09:13.349 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57429.log
2026-03-24T12:09:13.349 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28008.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28008.log.gz
2026-03-24T12:09:13.349 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59623.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59623.log.gz
2026-03-24T12:09:13.349 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28639.log
2026-03-24T12:09:13.350 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62653.log
2026-03-24T12:09:13.350 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.57429.log: /var/log/ceph/ceph-client.admin.28639.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57429.log.gz
2026-03-24T12:09:13.350 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28639.log.gz
2026-03-24T12:09:13.350 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80939.log
2026-03-24T12:09:13.351 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80939.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80939.log.gz
2026-03-24T12:09:13.351 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67991.log
2026-03-24T12:09:13.351 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62653.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87072.log
2026-03-24T12:09:13.351 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67991.log: 58.2% -- replaced
with /var/log/ceph/ceph-client.admin.62653.log.gz 2026-03-24T12:09:13.351 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67991.log.gz 2026-03-24T12:09:13.351 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45998.log 2026-03-24T12:09:13.351 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39274.log 2026-03-24T12:09:13.352 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.87072.log: /var/log/ceph/ceph-client.admin.45998.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87072.log.gz 2026-03-24T12:09:13.352 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45998.log.gz 2026-03-24T12:09:13.352 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67604.log 2026-03-24T12:09:13.352 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86921.log 2026-03-24T12:09:13.352 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39274.log: /var/log/ceph/ceph-client.admin.67604.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67604.log.gz 2026-03-24T12:09:13.352 INFO:teuthology.orchestra.run.vm05.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.39274.log.gz 2026-03-24T12:09:13.353 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73263.log 2026-03-24T12:09:13.353 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73671.log 2026-03-24T12:09:13.353 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.86921.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86921.log.gz 2026-03-24T12:09:13.353 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.73263.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.73263.log.gz 2026-03-24T12:09:13.353 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44433.log 2026-03-24T12:09:13.353 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75261.log 2026-03-24T12:09:13.354 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.73671.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73671.log.gz 2026-03-24T12:09:13.354 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44433.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44433.log.gz 2026-03-24T12:09:13.354 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33932.log 2026-03-24T12:09:13.354 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54377.log 2026-03-24T12:09:13.354 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75261.log: /var/log/ceph/ceph-client.admin.33932.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75261.log.gz 2026-03-24T12:09:13.355 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69685.log 2026-03-24T12:09:13.355 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33932.log.gz 2026-03-24T12:09:13.355 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54377.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54377.log.gz 2026-03-24T12:09:13.355 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74471.log 2026-03-24T12:09:13.355 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38192.log 2026-03-24T12:09:13.355 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.69685.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.69685.log.gz 2026-03-24T12:09:13.356 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74471.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74471.log.gz 2026-03-24T12:09:13.356 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37940.log 2026-03-24T12:09:13.356 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58908.log 2026-03-24T12:09:13.356 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.38192.log: /var/log/ceph/ceph-client.admin.37940.log: 25.8% 26.4% -- replaced with /var/log/ceph/ceph-client.admin.38192.log.gz -- replaced with /var/log/ceph/ceph-client.admin.37940.log.gz 2026-03-24T12:09:13.356 INFO:teuthology.orchestra.run.vm05.stderr: 2026-03-24T12:09:13.356 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42164.log 2026-03-24T12:09:13.357 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82499.log 2026-03-24T12:09:13.357 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.58908.log: /var/log/ceph/ceph-client.admin.42164.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58908.log.gz 2026-03-24T12:09:13.357 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42164.log.gz 2026-03-24T12:09:13.357 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33912.log 2026-03-24T12:09:13.357 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79192.log 2026-03-24T12:09:13.357 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82499.log: /var/log/ceph/ceph-client.admin.33912.log: 27.3% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82499.log.gz -- replaced with /var/log/ceph/ceph-client.admin.33912.log.gz 
2026-03-24T12:09:13.358 INFO:teuthology.orchestra.run.vm05.stderr: 2026-03-24T12:09:13.358 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38045.log 2026-03-24T12:09:13.358 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79031.log 2026-03-24T12:09:13.358 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.79192.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79192.log.gz 2026-03-24T12:09:13.358 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.38045.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.38045.log.gz 2026-03-24T12:09:13.358 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32170.log 2026-03-24T12:09:13.359 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27470.log 2026-03-24T12:09:13.359 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.79031.log: /var/log/ceph/ceph-client.admin.32170.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79031.log.gz 2026-03-24T12:09:13.359 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32170.log.gz 2026-03-24T12:09:13.359 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69491.log 2026-03-24T12:09:13.359 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47147.log 2026-03-24T12:09:13.359 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27470.log: /var/log/ceph/ceph-client.admin.69491.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27470.log.gz 2026-03-24T12:09:13.359 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69491.log.gz 2026-03-24T12:09:13.360 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.39640.log 2026-03-24T12:09:13.360 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56328.log 2026-03-24T12:09:13.360 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.47147.log: /var/log/ceph/ceph-client.admin.39640.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47147.log.gz 2026-03-24T12:09:13.360 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39640.log.gz 2026-03-24T12:09:13.360 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31847.log 2026-03-24T12:09:13.361 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35566.log 2026-03-24T12:09:13.361 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.56328.log: /var/log/ceph/ceph-client.admin.31847.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56328.log.gz 2026-03-24T12:09:13.361 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.31847.log.gz 2026-03-24T12:09:13.361 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80315.log 2026-03-24T12:09:13.361 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50714.log 2026-03-24T12:09:13.361 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35566.log: /var/log/ceph/ceph-client.admin.80315.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35566.log.gz 2026-03-24T12:09:13.361 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80315.log.gz 2026-03-24T12:09:13.362 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81992.log 2026-03-24T12:09:13.362 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.41104.log 2026-03-24T12:09:13.362 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50714.log: /var/log/ceph/ceph-client.admin.81992.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50714.log.gz 2026-03-24T12:09:13.362 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81992.log.gz 2026-03-24T12:09:13.362 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73752.log 2026-03-24T12:09:13.363 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49071.log 2026-03-24T12:09:13.363 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41104.log: /var/log/ceph/ceph-client.admin.73752.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41104.log.gz 2026-03-24T12:09:13.363 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73752.log.gz 2026-03-24T12:09:13.363 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60055.log 2026-03-24T12:09:13.363 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37062.log 2026-03-24T12:09:13.364 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.49071.log: /var/log/ceph/ceph-client.admin.60055.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49071.log.gz 2026-03-24T12:09:13.364 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60055.log.gz 2026-03-24T12:09:13.364 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47491.log 2026-03-24T12:09:13.364 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51101.log 2026-03-24T12:09:13.364 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.37062.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.37062.log.gz 2026-03-24T12:09:13.364 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.47491.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47491.log.gz 2026-03-24T12:09:13.365 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76341.log 2026-03-24T12:09:13.365 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47539.log 2026-03-24T12:09:13.365 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.51101.log: /var/log/ceph/ceph-client.admin.76341.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51101.log.gz 2026-03-24T12:09:13.365 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76341.log.gz 2026-03-24T12:09:13.365 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29875.log 2026-03-24T12:09:13.365 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64384.log 2026-03-24T12:09:13.366 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.47539.log: /var/log/ceph/ceph-client.admin.29875.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29875.log.gz 2026-03-24T12:09:13.366 INFO:teuthology.orchestra.run.vm05.stderr: 2.5% -- replaced with /var/log/ceph/ceph-client.admin.47539.log.gz 2026-03-24T12:09:13.366 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31106.log 2026-03-24T12:09:13.366 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59438.log 2026-03-24T12:09:13.366 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64384.log: /var/log/ceph/ceph-client.admin.31106.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64384.log.gz 2026-03-24T12:09:13.366 
INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31106.log.gz 2026-03-24T12:09:13.367 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25639.log 2026-03-24T12:09:13.367 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83807.log 2026-03-24T12:09:13.367 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59438.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59438.log.gz 2026-03-24T12:09:13.367 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.25639.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25639.log.gz 2026-03-24T12:09:13.367 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68034.log 2026-03-24T12:09:13.368 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62848.log 2026-03-24T12:09:13.368 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83807.log: /var/log/ceph/ceph-client.admin.68034.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83807.log.gz 2026-03-24T12:09:13.368 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68034.log.gz 2026-03-24T12:09:13.368 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31557.log 2026-03-24T12:09:13.368 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47984.log 2026-03-24T12:09:13.368 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62848.log: /var/log/ceph/ceph-client.admin.31557.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62848.log.gz 2026-03-24T12:09:13.368 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31557.log.gz 2026-03-24T12:09:13.369 
INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39382.log 2026-03-24T12:09:13.369 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61742.log 2026-03-24T12:09:13.369 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.47984.log: /var/log/ceph/ceph-client.admin.39382.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47984.log.gz 2026-03-24T12:09:13.369 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39382.log.gz 2026-03-24T12:09:13.369 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26103.log 2026-03-24T12:09:13.370 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26404.log 2026-03-24T12:09:13.370 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61742.log: /var/log/ceph/ceph-client.admin.26103.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61742.log.gz 2026-03-24T12:09:13.370 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26103.log.gz 2026-03-24T12:09:13.370 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39360.log 2026-03-24T12:09:13.370 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86814.log 2026-03-24T12:09:13.370 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.26404.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26404.log.gz 2026-03-24T12:09:13.371 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39360.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39360.log.gz 2026-03-24T12:09:13.371 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36654.log 2026-03-24T12:09:13.371 
INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28618.log 2026-03-24T12:09:13.371 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.86814.log: /var/log/ceph/ceph-client.admin.36654.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86814.log.gz 2026-03-24T12:09:13.371 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36654.log.gz 2026-03-24T12:09:13.371 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51058.log 2026-03-24T12:09:13.372 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82607.log 2026-03-24T12:09:13.372 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28618.log: /var/log/ceph/ceph-client.admin.51058.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51058.log.gz 2026-03-24T12:09:13.372 INFO:teuthology.orchestra.run.vm05.stderr: 5.3% -- replaced with /var/log/ceph/ceph-client.admin.28618.log.gz 2026-03-24T12:09:13.372 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79357.log 2026-03-24T12:09:13.372 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60958.log 2026-03-24T12:09:13.373 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82607.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82607.log.gz 2026-03-24T12:09:13.373 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.79357.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79357.log.gz 2026-03-24T12:09:13.373 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63857.log 2026-03-24T12:09:13.373 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78285.log 2026-03-24T12:09:13.373 
INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.60958.log: /var/log/ceph/ceph-client.admin.63857.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60958.log.gz 2026-03-24T12:09:13.373 INFO:teuthology.orchestra.run.vm05.stderr: 59.4% -- replaced with /var/log/ceph/ceph-client.admin.63857.log.gz 2026-03-24T12:09:13.373 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44493.log 2026-03-24T12:09:13.374 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78770.log 2026-03-24T12:09:13.374 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78285.log: /var/log/ceph/ceph-client.admin.44493.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78285.log.gz 2026-03-24T12:09:13.374 INFO:teuthology.orchestra.run.vm05.stderr: 26.2% -- replaced with /var/log/ceph/ceph-client.admin.44493.log.gz 2026-03-24T12:09:13.374 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42122.log 2026-03-24T12:09:13.374 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26232.log 2026-03-24T12:09:13.375 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78770.log: /var/log/ceph/ceph-client.admin.42122.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78770.log.gz 2026-03-24T12:09:13.375 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42122.log.gz 2026-03-24T12:09:13.375 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57731.log 2026-03-24T12:09:13.375 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29638.log 2026-03-24T12:09:13.375 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.26232.log: /var/log/ceph/ceph-client.admin.57731.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.26232.log.gz 2026-03-24T12:09:13.375 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57731.log.gz 2026-03-24T12:09:13.375 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78092.log 2026-03-24T12:09:13.376 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41212.log 2026-03-24T12:09:13.376 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.29638.log: /var/log/ceph/ceph-client.admin.78092.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29638.log.gz 2026-03-24T12:09:13.376 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78092.log.gz 2026-03-24T12:09:13.376 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68456.log 2026-03-24T12:09:13.376 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38234.log 2026-03-24T12:09:13.377 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41212.log: /var/log/ceph/ceph-client.admin.68456.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41212.log.gz 2026-03-24T12:09:13.377 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68456.log.gz 2026-03-24T12:09:13.377 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60291.log 2026-03-24T12:09:13.377 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62243.log 2026-03-24T12:09:13.377 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.38234.log: /var/log/ceph/ceph-client.admin.60291.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60291.log.gz 2026-03-24T12:09:13.377 INFO:teuthology.orchestra.run.vm05.stderr: 26.0% -- replaced with 
/var/log/ceph/ceph-client.admin.38234.log.gz 2026-03-24T12:09:13.378 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36824.log 2026-03-24T12:09:13.378 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33452.log 2026-03-24T12:09:13.378 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62243.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62243.log.gz 2026-03-24T12:09:13.378 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36824.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36824.log.gz 2026-03-24T12:09:13.378 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29059.log 2026-03-24T12:09:13.379 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55954.log 2026-03-24T12:09:13.379 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.29059.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29059.log.gz 2026-03-24T12:09:13.379 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.33452.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33452.log.gz 2026-03-24T12:09:13.379 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32205.log 2026-03-24T12:09:13.380 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90230.log 2026-03-24T12:09:13.380 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55954.log: /var/log/ceph/ceph-client.admin.32205.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55954.log.gz 2026-03-24T12:09:13.380 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32205.log.gz 2026-03-24T12:09:13.380 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.74723.log 2026-03-24T12:09:13.380 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34112.log 2026-03-24T12:09:13.380 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90230.log: /var/log/ceph/ceph-client.admin.74723.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90230.log.gz 2026-03-24T12:09:13.380 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74723.log.gz 2026-03-24T12:09:13.381 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55825.log 2026-03-24T12:09:13.381 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37576.log 2026-03-24T12:09:13.381 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34112.log: /var/log/ceph/ceph-client.admin.55825.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34112.log.gz 2026-03-24T12:09:13.381 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55825.log.gz 2026-03-24T12:09:13.381 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90144.log 2026-03-24T12:09:13.382 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40027.log 2026-03-24T12:09:13.382 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.37576.log: /var/log/ceph/ceph-client.admin.90144.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90144.log.gz 2026-03-24T12:09:13.382 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68665.log 2026-03-24T12:09:13.382 INFO:teuthology.orchestra.run.vm05.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.37576.log.gz 2026-03-24T12:09:13.382 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.51015.log 2026-03-24T12:09:13.383 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68665.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68665.log.gz 2026-03-24T12:09:13.383 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40027.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40027.log.gz 2026-03-24T12:09:13.383 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69011.log 2026-03-24T12:09:13.383 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61989.log 2026-03-24T12:09:13.384 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.69011.log: /var/log/ceph/ceph-client.admin.51015.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69011.log.gz 2026-03-24T12:09:13.384 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51015.log.gz 2026-03-24T12:09:13.384 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34812.log 2026-03-24T12:09:13.384 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54654.log 2026-03-24T12:09:13.384 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61989.log: /var/log/ceph/ceph-client.admin.34812.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34812.log.gz 2026-03-24T12:09:13.385 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61989.log.gz 2026-03-24T12:09:13.385 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62280.log 2026-03-24T12:09:13.385 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43532.log 2026-03-24T12:09:13.385 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54654.log: 
/var/log/ceph/ceph-client.admin.62280.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62280.log.gz -- replaced with /var/log/ceph/ceph-client.admin.54654.log.gz 2026-03-24T12:09:13.385 INFO:teuthology.orchestra.run.vm05.stderr: 2026-03-24T12:09:13.386 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44254.log 2026-03-24T12:09:13.386 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66746.log 2026-03-24T12:09:13.386 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43532.log: /var/log/ceph/ceph-client.admin.44254.log: 56.0% -- replaced with /var/log/ceph/ceph-client.admin.43532.log.gz 2026-03-24T12:09:13.386 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44254.log.gz 2026-03-24T12:09:13.387 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34612.log 2026-03-24T12:09:13.387 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40740.log 2026-03-24T12:09:13.387 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66746.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66746.log.gz 2026-03-24T12:09:13.387 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34612.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34612.log.gz 2026-03-24T12:09:13.388 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71287.log 2026-03-24T12:09:13.388 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49867.log 2026-03-24T12:09:13.388 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40740.log: /var/log/ceph/ceph-client.admin.71287.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40740.log.gz 2026-03-24T12:09:13.388 
INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71287.log.gz 2026-03-24T12:09:13.389 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37744.log 2026-03-24T12:09:13.389 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46473.log 2026-03-24T12:09:13.389 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.49867.log: /var/log/ceph/ceph-client.admin.37744.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49867.log.gz 2026-03-24T12:09:13.389 INFO:teuthology.orchestra.run.vm05.stderr: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.37744.log.gz 2026-03-24T12:09:13.390 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74321.log 2026-03-24T12:09:13.390 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36368.log 2026-03-24T12:09:13.390 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46473.log: /var/log/ceph/ceph-client.admin.74321.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46473.log.gz 2026-03-24T12:09:13.390 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74321.log.gz 2026-03-24T12:09:13.390 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87763.log 2026-03-24T12:09:13.391 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25996.log 2026-03-24T12:09:13.391 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36368.log: /var/log/ceph/ceph-client.admin.87763.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36368.log.gz 2026-03-24T12:09:13.391 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87763.log.gz 2026-03-24T12:09:13.391 
INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82151.log 2026-03-24T12:09:13.392 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34732.log 2026-03-24T12:09:13.392 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.25996.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25996.log.gz 2026-03-24T12:09:13.392 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82151.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82151.log.gz 2026-03-24T12:09:13.392 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61323.log 2026-03-24T12:09:13.392 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28844.log 2026-03-24T12:09:13.392 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34732.log: /var/log/ceph/ceph-client.admin.61323.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34732.log.gz 2026-03-24T12:09:13.392 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61323.log.gz 2026-03-24T12:09:13.393 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44684.log 2026-03-24T12:09:13.393 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63952.log 2026-03-24T12:09:13.393 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28844.log: /var/log/ceph/ceph-client.admin.44684.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28844.log.gz 2026-03-24T12:09:13.393 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44684.log.gz 2026-03-24T12:09:13.393 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33392.log 2026-03-24T12:09:13.393 
INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74741.log 2026-03-24T12:09:13.394 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63952.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63952.log.gz 2026-03-24T12:09:13.394 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.33392.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33392.log.gz 2026-03-24T12:09:13.394 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82865.log 2026-03-24T12:09:13.394 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35512.log 2026-03-24T12:09:13.394 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74741.log: /var/log/ceph/ceph-client.admin.82865.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74741.log.gz 2026-03-24T12:09:13.395 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82865.log.gz 2026-03-24T12:09:13.395 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87136.log 2026-03-24T12:09:13.395 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50375.log 2026-03-24T12:09:13.395 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35512.log: /var/log/ceph/ceph-client.admin.87136.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35512.log.gz 2026-03-24T12:09:13.395 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87136.log.gz 2026-03-24T12:09:13.395 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65528.log 2026-03-24T12:09:13.396 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53436.log 2026-03-24T12:09:13.396 
INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50375.log: /var/log/ceph/ceph-client.admin.65528.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50375.log.gz 2026-03-24T12:09:13.396 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65528.log.gz 2026-03-24T12:09:13.396 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31830.log 2026-03-24T12:09:13.396 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39317.log 2026-03-24T12:09:13.396 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.53436.log: /var/log/ceph/ceph-client.admin.31830.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53436.log.gz 2026-03-24T12:09:13.397 INFO:teuthology.orchestra.run.vm05.stderr: 2.5% -- replaced with /var/log/ceph/ceph-client.admin.31830.log.gz 2026-03-24T12:09:13.397 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29982.log 2026-03-24T12:09:13.397 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31020.log 2026-03-24T12:09:13.397 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39317.log: /var/log/ceph/ceph-client.admin.29982.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39317.log.gz 2026-03-24T12:09:13.397 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29982.log.gz 2026-03-24T12:09:13.397 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32887.log 2026-03-24T12:09:13.398 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72062.log 2026-03-24T12:09:13.398 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.31020.log: /var/log/ceph/ceph-client.admin.32887.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.31020.log.gz 2026-03-24T12:09:13.398 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32887.log.gz 2026-03-24T12:09:13.398 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54461.log 2026-03-24T12:09:13.398 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35617.log 2026-03-24T12:09:13.399 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72062.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72062.log.gz 2026-03-24T12:09:13.399 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54461.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81565.log 2026-03-24T12:09:13.399 INFO:teuthology.orchestra.run.vm05.stderr: 26.5% -- replaced with /var/log/ceph/ceph-client.admin.54461.log.gz 2026-03-24T12:09:13.399 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35617.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35617.log.gz 2026-03-24T12:09:13.399 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88304.log 2026-03-24T12:09:13.400 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58585.log 2026-03-24T12:09:13.400 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.81565.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81565.log.gz 2026-03-24T12:09:13.400 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.88304.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88304.log.gz 2026-03-24T12:09:13.400 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62322.log 2026-03-24T12:09:13.400 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.58585.log: gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.61237.log 2026-03-24T12:09:13.400 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58585.log.gz 2026-03-24T12:09:13.401 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62322.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82097.log 2026-03-24T12:09:13.401 INFO:teuthology.orchestra.run.vm05.stderr: 30.2% -- replaced with /var/log/ceph/ceph-client.admin.62322.log.gz 2026-03-24T12:09:13.401 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61237.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61237.log.gz 2026-03-24T12:09:13.401 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91800.log 2026-03-24T12:09:13.401 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28309.log 2026-03-24T12:09:13.402 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82097.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82097.log.gz 2026-03-24T12:09:13.402 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.91800.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91800.log.gz 2026-03-24T12:09:13.402 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91106.log 2026-03-24T12:09:13.402 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28309.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28309.log.gz 2026-03-24T12:09:13.402 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40381.log 2026-03-24T12:09:13.403 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.91106.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26254.log 2026-03-24T12:09:13.403 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.91106.log.gz 2026-03-24T12:09:13.403 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40381.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37096.log 2026-03-24T12:09:13.403 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40381.log.gz 2026-03-24T12:09:13.403 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.26254.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26254.log.gz 2026-03-24T12:09:13.404 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81933.log 2026-03-24T12:09:13.404 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40782.log 2026-03-24T12:09:13.404 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.37096.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37096.log.gz 2026-03-24T12:09:13.404 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.81933.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81933.log.gz 2026-03-24T12:09:13.404 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41470.log 2026-03-24T12:09:13.404 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40782.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40782.log.gz 2026-03-24T12:09:13.405 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43433.log 2026-03-24T12:09:13.405 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41470.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73886.log 2026-03-24T12:09:13.405 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41470.log.gz 2026-03-24T12:09:13.405 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43433.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.43433.log.gz 2026-03-24T12:09:13.405 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64850.log 2026-03-24T12:09:13.406 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.73886.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73886.log.gz 2026-03-24T12:09:13.406 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80294.log 2026-03-24T12:09:13.406 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25508.log 2026-03-24T12:09:13.406 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64850.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64850.log.gz 2026-03-24T12:09:13.406 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80294.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80294.log.gz 2026-03-24T12:09:13.406 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51165.log 2026-03-24T12:09:13.407 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68436.log 2026-03-24T12:09:13.407 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.25508.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25508.log.gz 2026-03-24T12:09:13.407 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.51165.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51165.log.gz 2026-03-24T12:09:13.407 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54317.log 2026-03-24T12:09:13.407 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25840.log 2026-03-24T12:09:13.408 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68436.log: /var/log/ceph/ceph-client.admin.54317.log: 0.0% 
-- replaced with /var/log/ceph/ceph-client.admin.54317.log.gz 2026-03-24T12:09:13.408 INFO:teuthology.orchestra.run.vm05.stderr: 13.6% -- replaced with /var/log/ceph/ceph-client.admin.68436.log.gz 2026-03-24T12:09:13.408 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67583.log 2026-03-24T12:09:13.408 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73327.log 2026-03-24T12:09:13.408 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.25840.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25840.log.gz 2026-03-24T12:09:13.408 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67583.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67583.log.gz 2026-03-24T12:09:13.408 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52793.log 2026-03-24T12:09:13.409 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61087.log 2026-03-24T12:09:13.409 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.73327.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73327.log.gz 2026-03-24T12:09:13.409 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.52793.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52793.log.gz 2026-03-24T12:09:13.409 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30578.log 2026-03-24T12:09:13.409 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72388.log 2026-03-24T12:09:13.409 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61087.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61087.log.gz 2026-03-24T12:09:13.409 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30578.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.30578.log.gz 2026-03-24T12:09:13.410 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80337.log 2026-03-24T12:09:13.410 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53309.log 2026-03-24T12:09:13.410 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72388.log: /var/log/ceph/ceph-client.admin.80337.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80337.log.gz 2026-03-24T12:09:13.410 INFO:teuthology.orchestra.run.vm05.stderr: 58.6% -- replaced with /var/log/ceph/ceph-client.admin.72388.log.gz 2026-03-24T12:09:13.410 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83575.log 2026-03-24T12:09:13.411 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72623.log 2026-03-24T12:09:13.411 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.53309.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53309.log.gz 2026-03-24T12:09:13.411 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83575.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83575.log.gz 2026-03-24T12:09:13.411 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28665.log 2026-03-24T12:09:13.411 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77680.log 2026-03-24T12:09:13.412 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72623.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72623.log.gz 2026-03-24T12:09:13.412 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28665.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28665.log.gz 2026-03-24T12:09:13.412 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.65722.log 2026-03-24T12:09:13.412 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37639.log 2026-03-24T12:09:13.412 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77680.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77680.log.gz 2026-03-24T12:09:13.412 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65722.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65722.log.gz 2026-03-24T12:09:13.412 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74133.log 2026-03-24T12:09:13.413 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30219.log 2026-03-24T12:09:13.413 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.37639.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37639.log.gz 2026-03-24T12:09:13.413 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74133.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74133.log.gz 2026-03-24T12:09:13.413 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35312.log 2026-03-24T12:09:13.413 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84087.log 2026-03-24T12:09:13.413 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30219.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30219.log.gz 2026-03-24T12:09:13.414 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35312.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35312.log.gz 2026-03-24T12:09:13.414 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65829.log 2026-03-24T12:09:13.414 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.75580.log 2026-03-24T12:09:13.414 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84087.log: /var/log/ceph/ceph-client.admin.65829.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84087.log.gz 2026-03-24T12:09:13.414 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65829.log.gz 2026-03-24T12:09:13.414 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91826.log 2026-03-24T12:09:13.415 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53396.log 2026-03-24T12:09:13.415 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75580.log: /var/log/ceph/ceph-client.admin.91826.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75580.log.gz 2026-03-24T12:09:13.415 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91826.log.gz 2026-03-24T12:09:13.415 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36620.log 2026-03-24T12:09:13.415 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47512.log 2026-03-24T12:09:13.415 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.53396.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53396.log.gz 2026-03-24T12:09:13.415 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36620.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36620.log.gz 2026-03-24T12:09:13.416 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59003.log 2026-03-24T12:09:13.416 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76062.log 2026-03-24T12:09:13.416 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.47512.log: 
/var/log/ceph/ceph-client.admin.59003.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47512.log.gz 2026-03-24T12:09:13.416 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59003.log.gz 2026-03-24T12:09:13.416 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38003.log 2026-03-24T12:09:13.416 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83936.log 2026-03-24T12:09:13.417 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.76062.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76062.log.gz 2026-03-24T12:09:13.417 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.38003.log: 25.6% -- replaced with /var/log/ceph/ceph-client.admin.38003.log.gz 2026-03-24T12:09:13.417 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54050.log 2026-03-24T12:09:13.417 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55459.log 2026-03-24T12:09:13.417 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83936.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83936.log.gz 2026-03-24T12:09:13.417 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.54050.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54050.log.gz 2026-03-24T12:09:13.417 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84715.log 2026-03-24T12:09:13.418 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51766.log 2026-03-24T12:09:13.418 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55459.log: /var/log/ceph/ceph-client.admin.84715.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55459.log.gz 2026-03-24T12:09:13.418 
INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84715.log.gz 2026-03-24T12:09:13.418 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64737.log 2026-03-24T12:09:13.418 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69448.log 2026-03-24T12:09:13.419 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64737.log: /var/log/ceph/ceph-client.admin.51766.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64737.log.gz 2026-03-24T12:09:13.419 INFO:teuthology.orchestra.run.vm05.stderr: 54.9% -- replaced with /var/log/ceph/ceph-client.admin.51766.log.gz 2026-03-24T12:09:13.419 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46526.log 2026-03-24T12:09:13.419 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34792.log 2026-03-24T12:09:13.419 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.69448.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69448.log.gz 2026-03-24T12:09:13.420 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46526.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.46526.log.gz 2026-03-24T12:09:13.420 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26873.log 2026-03-24T12:09:13.420 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34792.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34792.log.gz 2026-03-24T12:09:13.420 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58756.log 2026-03-24T12:09:13.420 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65464.log 2026-03-24T12:09:13.421 
INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.26873.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26873.log.gz 2026-03-24T12:09:13.421 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.58756.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58756.log.gz 2026-03-24T12:09:13.421 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50189.log 2026-03-24T12:09:13.421 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30047.log 2026-03-24T12:09:13.421 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65464.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65464.log.gz 2026-03-24T12:09:13.421 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50189.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62868.log 2026-03-24T12:09:13.421 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.50189.log.gz 2026-03-24T12:09:13.422 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30047.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30047.log.gz 2026-03-24T12:09:13.422 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59243.log 2026-03-24T12:09:13.422 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87612.log 2026-03-24T12:09:13.422 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62868.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62868.log.gz 2026-03-24T12:09:13.422 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59243.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59243.log.gz 2026-03-24T12:09:13.423 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67862.log 
2026-03-24T12:09:13.423 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80103.log
2026-03-24T12:09:13.423 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.87612.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87612.log.gz
2026-03-24T12:09:13.423 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67862.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67862.log.gz
2026-03-24T12:09:13.423 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80179.log
2026-03-24T12:09:13.423 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67518.log
2026-03-24T12:09:13.424 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80103.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80103.log.gz
2026-03-24T12:09:13.424 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80179.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80179.log.gz
2026-03-24T12:09:13.424 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87999.log
2026-03-24T12:09:13.424 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45525.log
2026-03-24T12:09:13.424 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67518.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67518.log.gz
2026-03-24T12:09:13.424 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.87999.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87999.log.gz
2026-03-24T12:09:13.424 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73801.log
2026-03-24T12:09:13.425 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91051.log
2026-03-24T12:09:13.425 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45525.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45525.log.gz
2026-03-24T12:09:13.425 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.73801.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73801.log.gz
2026-03-24T12:09:13.425 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65507.log
2026-03-24T12:09:13.425 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61044.log
2026-03-24T12:09:13.425 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.91051.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91051.log.gz
2026-03-24T12:09:13.426 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65507.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65507.log.gz
2026-03-24T12:09:13.426 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41719.log
2026-03-24T12:09:13.426 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41581.log
2026-03-24T12:09:13.426 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61044.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61044.log.gz
2026-03-24T12:09:13.426 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41719.log: 58.4% -- replaced with /var/log/ceph/ceph-client.admin.41719.log.gz
2026-03-24T12:09:13.426 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58606.log
2026-03-24T12:09:13.427 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41581.log: 29.4% -- replaced with /var/log/ceph/ceph-client.admin.41581.log.gz
2026-03-24T12:09:13.427 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52965.log
2026-03-24T12:09:13.427 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35634.log
2026-03-24T12:09:13.428 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.58606.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58606.log.gz
2026-03-24T12:09:13.428 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.52965.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52965.log.gz
2026-03-24T12:09:13.428 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50950.log
2026-03-24T12:09:13.428 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38896.log
2026-03-24T12:09:13.428 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35634.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35634.log.gz
2026-03-24T12:09:13.428 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50950.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50950.log.gz
2026-03-24T12:09:13.428 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41319.log
2026-03-24T12:09:13.429 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49717.log
2026-03-24T12:09:13.429 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.38896.log: /var/log/ceph/ceph-client.admin.41319.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41319.log.gz
2026-03-24T12:09:13.429 INFO:teuthology.orchestra.run.vm05.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.38896.log.gz
2026-03-24T12:09:13.429 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45718.log
2026-03-24T12:09:13.429 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34532.log
2026-03-24T12:09:13.429 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.49717.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49717.log.gz
2026-03-24T12:09:13.430 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45718.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45718.log.gz
2026-03-24T12:09:13.430 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55157.log
2026-03-24T12:09:13.430 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36062.log
2026-03-24T12:09:13.430 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34532.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34532.log.gz
2026-03-24T12:09:13.430 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55157.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55157.log.gz
2026-03-24T12:09:13.430 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48963.log
2026-03-24T12:09:13.431 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85059.log
2026-03-24T12:09:13.431 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36062.log: /var/log/ceph/ceph-client.admin.48963.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36062.log.gz
2026-03-24T12:09:13.431 INFO:teuthology.orchestra.run.vm05.stderr:gzip 55.9% -5 --verbose -- replaced with /var/log/ceph/ceph-client.admin.48963.log.gz --
2026-03-24T12:09:13.431 INFO:teuthology.orchestra.run.vm05.stderr: /var/log/ceph/ceph-client.admin.39618.log
2026-03-24T12:09:13.431 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.85059.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85059.log.gz
2026-03-24T12:09:13.431 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68538.log
2026-03-24T12:09:13.432 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73973.log
2026-03-24T12:09:13.432 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39618.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39618.log.gz
2026-03-24T12:09:13.432 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68538.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68538.log.gz
2026-03-24T12:09:13.432 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80028.log
2026-03-24T12:09:13.432 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72329.log
2026-03-24T12:09:13.432 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.73973.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73973.log.gz
2026-03-24T12:09:13.433 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80028.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80028.log.gz
2026-03-24T12:09:13.433 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64890.log
2026-03-24T12:09:13.433 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47254.log
2026-03-24T12:09:13.433 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72329.log: /var/log/ceph/ceph-client.admin.64890.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64890.log.gz
2026-03-24T12:09:13.433 INFO:teuthology.orchestra.run.vm05.stderr: 52.5% -- replaced with /var/log/ceph/ceph-client.admin.72329.log.gz
2026-03-24T12:09:13.433 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54440.log
2026-03-24T12:09:13.434 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82054.log
2026-03-24T12:09:13.434 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.47254.log: /var/log/ceph/ceph-client.admin.54440.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47254.log.gz
2026-03-24T12:09:13.434 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54440.log.gz
2026-03-24T12:09:13.434 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77118.log
2026-03-24T12:09:13.434 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45869.log
2026-03-24T12:09:13.435 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82054.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82054.log.gz
2026-03-24T12:09:13.435 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77118.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77118.log.gz
2026-03-24T12:09:13.435 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58241.log
2026-03-24T12:09:13.435 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77139.log
2026-03-24T12:09:13.435 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45869.log: /var/log/ceph/ceph-client.admin.58241.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45869.log.gz
2026-03-24T12:09:13.435 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58241.log.gz
2026-03-24T12:09:13.435 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73585.log
2026-03-24T12:09:13.436 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63737.log
2026-03-24T12:09:13.436 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77139.log: /var/log/ceph/ceph-client.admin.73585.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77139.log.gz
2026-03-24T12:09:13.436 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73585.log.gz
2026-03-24T12:09:13.436 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60850.log
2026-03-24T12:09:13.436 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88236.log
2026-03-24T12:09:13.437 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63737.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63737.log.gz
2026-03-24T12:09:13.437 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.60850.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60850.log.gz
2026-03-24T12:09:13.437 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35410.log
2026-03-24T12:09:13.437 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29274.log
2026-03-24T12:09:13.437 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.88236.log: /var/log/ceph/ceph-client.admin.35410.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88236.log.gz
2026-03-24T12:09:13.437 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35410.log.gz
2026-03-24T12:09:13.437 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79716.log
2026-03-24T12:09:13.438 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66130.log
2026-03-24T12:09:13.438 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.29274.log: /var/log/ceph/ceph-client.admin.79716.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29274.log.gz
2026-03-24T12:09:13.438 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79716.log.gz
2026-03-24T12:09:13.438 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87480.log
2026-03-24T12:09:13.438 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76449.log
2026-03-24T12:09:13.439 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.87480.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87480.log.gz
2026-03-24T12:09:13.439 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66130.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66130.log.gz
2026-03-24T12:09:13.439 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63777.log
2026-03-24T12:09:13.439 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63088.log
2026-03-24T12:09:13.440 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.76449.log: /var/log/ceph/ceph-client.admin.63777.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76449.log.gz
2026-03-24T12:09:13.440 INFO:teuthology.orchestra.run.vm05.stderr: 53.0% -- replaced with /var/log/ceph/ceph-client.admin.63777.log.gz
2026-03-24T12:09:13.440 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60936.log
2026-03-24T12:09:13.440 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58520.log
2026-03-24T12:09:13.440 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63088.log: /var/log/ceph/ceph-client.admin.60936.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63088.log.gz
2026-03-24T12:09:13.440 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60936.log.gz
2026-03-24T12:09:13.441 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84887.log
2026-03-24T12:09:13.441 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84302.log
2026-03-24T12:09:13.441 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.58520.log: /var/log/ceph/ceph-client.admin.84887.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58520.log.gz
2026-03-24T12:09:13.441 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84887.log.gz
2026-03-24T12:09:13.441 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87548.log
2026-03-24T12:09:13.441 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89277.log
2026-03-24T12:09:13.442 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84302.log: /var/log/ceph/ceph-client.admin.87548.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84302.log.gz
2026-03-24T12:09:13.442 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87548.log.gz
2026-03-24T12:09:13.442 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55007.log
2026-03-24T12:09:13.442 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.88676.log
2026-03-24T12:09:13.442 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.89277.log: /var/log/ceph/ceph-client.admin.55007.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89277.log.gz
2026-03-24T12:09:13.442 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55007.log.gz
2026-03-24T12:09:13.443 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40760.log
2026-03-24T12:09:13.443 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.88676.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.88676.log.gz
2026-03-24T12:09:13.443 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75804.log
2026-03-24T12:09:13.443 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46176.log
2026-03-24T12:09:13.444 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40760.log: /var/log/ceph/ceph-client.admin.75804.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75804.log.gz
2026-03-24T12:09:13.444 INFO:teuthology.orchestra.run.vm05.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.40760.log.gz
2026-03-24T12:09:13.444 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74531.log
2026-03-24T12:09:13.444 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75338.log
2026-03-24T12:09:13.444 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46176.log: /var/log/ceph/ceph-client.admin.74531.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46176.log.gz
2026-03-24T12:09:13.444 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74531.log.gz
2026-03-24T12:09:13.444 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42857.log
2026-03-24T12:09:13.445 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58198.log
2026-03-24T12:09:13.445 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75338.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75338.log.gz
2026-03-24T12:09:13.445 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.42857.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42857.log.gz
2026-03-24T12:09:13.445 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38171.log
2026-03-24T12:09:13.445 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89253.log
2026-03-24T12:09:13.445 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.58198.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58198.log.gz
2026-03-24T12:09:13.446 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.38171.log: 26.8% -- replaced with /var/log/ceph/ceph-client.admin.38171.log.gz
2026-03-24T12:09:13.446 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53969.log
2026-03-24T12:09:13.446 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74193.log
2026-03-24T12:09:13.446 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.89253.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89253.log.gz
2026-03-24T12:09:13.446 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.53969.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53969.log.gz
2026-03-24T12:09:13.446 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77877.log
2026-03-24T12:09:13.447 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91311.log
2026-03-24T12:09:13.447 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74193.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74193.log.gz
2026-03-24T12:09:13.447 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77877.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77877.log.gz
2026-03-24T12:09:13.447 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79471.log
2026-03-24T12:09:13.447 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56689.log
2026-03-24T12:09:13.447 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.91311.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91311.log.gz
2026-03-24T12:09:13.447 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.79471.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79471.log.gz
2026-03-24T12:09:13.448 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71972.log
2026-03-24T12:09:13.448 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46351.log
2026-03-24T12:09:13.448 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.56689.log: /var/log/ceph/ceph-client.admin.71972.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56689.log.gz
2026-03-24T12:09:13.448 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71972.log.gz
2026-03-24T12:09:13.448 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28781.log
2026-03-24T12:09:13.448 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48602.log
2026-03-24T12:09:13.449 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46351.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46351.log.gz
2026-03-24T12:09:13.449 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28781.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28781.log.gz
2026-03-24T12:09:13.449 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34372.log
2026-03-24T12:09:13.449 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61452.log
2026-03-24T12:09:13.449 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.48602.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48602.log.gz
2026-03-24T12:09:13.449 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34372.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34372.log.gz
2026-03-24T12:09:13.449 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57486.log
2026-03-24T12:09:13.450 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80853.log
2026-03-24T12:09:13.450 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.61452.log: /var/log/ceph/ceph-client.admin.57486.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61452.log.gz
2026-03-24T12:09:13.450 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57486.log.gz
2026-03-24T12:09:13.450 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81024.log
2026-03-24T12:09:13.450 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64773.log
2026-03-24T12:09:13.450 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80853.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80853.log.gz
2026-03-24T12:09:13.451 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.81024.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39253.log
2026-03-24T12:09:13.451 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81024.log.gz
2026-03-24T12:09:13.451 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64773.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64773.log.gz
2026-03-24T12:09:13.451 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69556.log
2026-03-24T12:09:13.452 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66938.log
2026-03-24T12:09:13.452 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39253.log: /var/log/ceph/ceph-client.admin.69556.log: 26.6% -- replaced with /var/log/ceph/ceph-client.admin.39253.log.gz
2026-03-24T12:09:13.452 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69556.log.gz
2026-03-24T12:09:13.452 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86943.log
2026-03-24T12:09:13.452 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55782.log
2026-03-24T12:09:13.452 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66938.log: /var/log/ceph/ceph-client.admin.86943.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66938.log.gz
2026-03-24T12:09:13.453 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86943.log.gz
2026-03-24T12:09:13.453 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85209.log
2026-03-24T12:09:13.453 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65851.log
2026-03-24T12:09:13.453 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55782.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55782.log.gz
2026-03-24T12:09:13.453 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.85209.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85209.log.gz
2026-03-24T12:09:13.453 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58069.log
2026-03-24T12:09:13.454 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58112.log
2026-03-24T12:09:13.454 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.65851.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65851.log.gz
2026-03-24T12:09:13.454 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.58069.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58069.log.gz
2026-03-24T12:09:13.454 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82254.log
2026-03-24T12:09:13.454 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72765.log
2026-03-24T12:09:13.454 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.58112.log: /var/log/ceph/ceph-client.admin.82254.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82254.log.gz
2026-03-24T12:09:13.455 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58112.log.gz
2026-03-24T12:09:13.455 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90338.log
2026-03-24T12:09:13.455 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31472.log
2026-03-24T12:09:13.455 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72765.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72765.log.gz
2026-03-24T12:09:13.455 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90338.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90338.log.gz
2026-03-24T12:09:13.456 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69149.log
2026-03-24T12:09:13.456 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35427.log
2026-03-24T12:09:13.456 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.31472.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31472.log.gz
2026-03-24T12:09:13.456 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.69149.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69149.log.gz
2026-03-24T12:09:13.456 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39211.log
2026-03-24T12:09:13.456 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62613.log
2026-03-24T12:09:13.457 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35427.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35427.log.gz
2026-03-24T12:09:13.457 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39211.log: 25.8% -- replaced with /var/log/ceph/ceph-client.admin.39211.log.gz
2026-03-24T12:09:13.457 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36266.log
2026-03-24T12:09:13.457 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64599.log
2026-03-24T12:09:13.457 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62613.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62613.log.gz
2026-03-24T12:09:13.457 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36266.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36266.log.gz
2026-03-24T12:09:13.458 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78178.log
2026-03-24T12:09:13.458 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29423.log
2026-03-24T12:09:13.458 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64599.log: /var/log/ceph/ceph-client.admin.78178.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78178.log.gz
2026-03-24T12:09:13.458 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83786.log
2026-03-24T12:09:13.458 INFO:teuthology.orchestra.run.vm05.stderr: 54.8% -- replaced with /var/log/ceph/ceph-client.admin.64599.log.gz
2026-03-24T12:09:13.458 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75116.log
2026-03-24T12:09:13.459 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83786.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83786.log.gz
2026-03-24T12:09:13.459 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.29423.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29423.log.gz
2026-03-24T12:09:13.459 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68789.log
2026-03-24T12:09:13.459 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79008.log
2026-03-24T12:09:13.459 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75116.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75116.log.gz
2026-03-24T12:09:13.459 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68789.log: 11.1% -- replaced with /var/log/ceph/ceph-client.admin.68789.log.gz
2026-03-24T12:09:13.460 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57012.log
2026-03-24T12:09:13.460 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72430.log
2026-03-24T12:09:13.460 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.79008.log: /var/log/ceph/ceph-client.admin.57012.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79008.log.gz
2026-03-24T12:09:13.460 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57012.log.gz
2026-03-24T12:09:13.460 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57628.log
2026-03-24T12:09:13.460 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81199.log
2026-03-24T12:09:13.461 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72430.log: /var/log/ceph/ceph-client.admin.57628.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72430.log.gz
2026-03-24T12:09:13.461 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57628.log.gz
2026-03-24T12:09:13.461 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30956.log
2026-03-24T12:09:13.461 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79960.log
2026-03-24T12:09:13.461 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.81199.log: /var/log/ceph/ceph-client.admin.30956.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81199.log.gz
2026-03-24T12:09:13.461 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30956.log.gz
2026-03-24T12:09:13.461 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81629.log
2026-03-24T12:09:13.462 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64304.log
2026-03-24T12:09:13.462 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.79960.log: /var/log/ceph/ceph-client.admin.81629.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79960.log.gz
2026-03-24T12:09:13.462 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81629.log.gz
2026-03-24T12:09:13.462 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44192.log
2026-03-24T12:09:13.462 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64304.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64304.log.gz
2026-03-24T12:09:13.463 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85801.log
2026-03-24T12:09:13.463 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44192.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35719.log
2026-03-24T12:09:13.463 INFO:teuthology.orchestra.run.vm05.stderr: 31.8% -- replaced with /var/log/ceph/ceph-client.admin.44192.log.gz
2026-03-24T12:09:13.463 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.85801.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37681.log
2026-03-24T12:09:13.463 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85801.log.gz
2026-03-24T12:09:13.463 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35719.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35719.log.gz
2026-03-24T12:09:13.463 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56668.log
2026-03-24T12:09:13.464 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52621.log
2026-03-24T12:09:13.464 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.37681.log: /var/log/ceph/ceph-client.admin.56668.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.37681.log.gz
2026-03-24T12:09:13.464 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56668.log.gz
2026-03-24T12:09:13.464 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28577.log
2026-03-24T12:09:13.464 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53637.log
2026-03-24T12:09:13.464 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.52621.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52621.log.gz
2026-03-24T12:09:13.465 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28577.log:
gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41233.log 2026-03-24T12:09:13.465 INFO:teuthology.orchestra.run.vm05.stderr: 5.3% -- replaced with /var/log/ceph/ceph-client.admin.28577.log.gz 2026-03-24T12:09:13.465 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.53637.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53637.log.gz 2026-03-24T12:09:13.465 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86857.log 2026-03-24T12:09:13.466 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41233.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46508.log 2026-03-24T12:09:13.466 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41233.log.gz 2026-03-24T12:09:13.466 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.86857.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75298.log 2026-03-24T12:09:13.466 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86857.log.gz 2026-03-24T12:09:13.466 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46508.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46508.log.gz 2026-03-24T12:09:13.467 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56055.log 2026-03-24T12:09:13.467 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75298.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41190.log 2026-03-24T12:09:13.467 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75298.log.gz 2026-03-24T12:09:13.467 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.56055.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56055.log.gz 2026-03-24T12:09:13.467 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.30460.log 2026-03-24T12:09:13.468 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33852.log 2026-03-24T12:09:13.468 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41190.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41190.log.gz 2026-03-24T12:09:13.468 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30460.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30460.log.gz 2026-03-24T12:09:13.468 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89301.log 2026-03-24T12:09:13.468 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34292.log 2026-03-24T12:09:13.469 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.33852.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33852.log.gz 2026-03-24T12:09:13.469 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.89301.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89301.log.gz 2026-03-24T12:09:13.469 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37366.log 2026-03-24T12:09:13.469 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46333.log 2026-03-24T12:09:13.469 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.34292.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34292.log.gz 2026-03-24T12:09:13.470 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.37366.log: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.37366.log.gz 2026-03-24T12:09:13.470 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74668.log 2026-03-24T12:09:13.470 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.69792.log 2026-03-24T12:09:13.470 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46333.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46333.log.gz 2026-03-24T12:09:13.470 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74668.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74668.log.gz 2026-03-24T12:09:13.470 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26684.log 2026-03-24T12:09:13.471 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.69792.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69792.log.gz 2026-03-24T12:09:13.471 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60764.log 2026-03-24T12:09:13.471 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.26684.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85038.log 2026-03-24T12:09:13.471 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26684.log.gz 2026-03-24T12:09:13.471 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.60764.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60764.log.gz 2026-03-24T12:09:13.472 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55674.log 2026-03-24T12:09:13.472 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.85038.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87655.log 2026-03-24T12:09:13.472 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85038.log.gz 2026-03-24T12:09:13.472 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55674.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55674.log.gz 2026-03-24T12:09:13.472 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 
--verbose -- /var/log/ceph/ceph-client.admin.39941.log 2026-03-24T12:09:13.473 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26060.log 2026-03-24T12:09:13.473 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.87655.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87655.log.gz 2026-03-24T12:09:13.473 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39941.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39941.log.gz 2026-03-24T12:09:13.473 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89453.log 2026-03-24T12:09:13.473 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39919.log 2026-03-24T12:09:13.474 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.26060.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26060.log.gz 2026-03-24T12:09:13.474 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.89453.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89453.log.gz 2026-03-24T12:09:13.474 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75597.log 2026-03-24T12:09:13.474 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.81113.log 2026-03-24T12:09:13.475 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75597.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75597.log.gz 2026-03-24T12:09:13.475 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39919.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39919.log.gz 2026-03-24T12:09:13.476 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84345.log 2026-03-24T12:09:13.476 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.81113.log: gzip 
-5 --verbose -- /var/log/ceph/ceph-client.admin.32512.log 2026-03-24T12:09:13.476 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.81113.log.gz 2026-03-24T12:09:13.476 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84345.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84345.log.gz 2026-03-24T12:09:13.476 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65872.log 2026-03-24T12:09:13.477 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91670.log 2026-03-24T12:09:13.477 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32512.log: /var/log/ceph/ceph-client.admin.65872.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32512.log.gz 2026-03-24T12:09:13.477 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65872.log.gz 2026-03-24T12:09:13.477 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30479.log 2026-03-24T12:09:13.477 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26619.log 2026-03-24T12:09:13.478 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.91670.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91670.log.gz 2026-03-24T12:09:13.478 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30479.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30479.log.gz 2026-03-24T12:09:13.478 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44133.log 2026-03-24T12:09:13.478 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45178.log 2026-03-24T12:09:13.478 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.26619.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.26619.log.gz 2026-03-24T12:09:13.479 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28287.log 2026-03-24T12:09:13.479 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44133.log: /var/log/ceph/ceph-client.admin.45178.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44133.log.gz 2026-03-24T12:09:13.479 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45178.log.gz 2026-03-24T12:09:13.479 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63654.log 2026-03-24T12:09:13.479 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28287.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66314.log 2026-03-24T12:09:13.480 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28287.log.gz 2026-03-24T12:09:13.480 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63654.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63654.log.gz 2026-03-24T12:09:13.480 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84479.log 2026-03-24T12:09:13.480 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64404.log 2026-03-24T12:09:13.480 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66314.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66314.log.gz 2026-03-24T12:09:13.480 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84479.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84479.log.gz 2026-03-24T12:09:13.481 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74998.log 2026-03-24T12:09:13.481 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.71760.log 2026-03-24T12:09:13.481 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64404.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64404.log.gz 2026-03-24T12:09:13.481 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74998.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74998.log.gz 2026-03-24T12:09:13.481 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78350.log 2026-03-24T12:09:13.482 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36351.log 2026-03-24T12:09:13.482 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.71760.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71760.log.gz 2026-03-24T12:09:13.482 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78350.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78350.log.gz 2026-03-24T12:09:13.482 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53843.log 2026-03-24T12:09:13.482 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47619.log 2026-03-24T12:09:13.483 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36351.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36351.log.gz 2026-03-24T12:09:13.483 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.53843.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53843.log.gz 2026-03-24T12:09:13.483 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59085.log 2026-03-24T12:09:13.483 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33812.log 2026-03-24T12:09:13.483 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.47619.log: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.47619.log.gz 2026-03-24T12:09:13.483 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59085.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59085.log.gz 2026-03-24T12:09:13.484 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40069.log 2026-03-24T12:09:13.484 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37555.log 2026-03-24T12:09:13.484 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.33812.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33812.log.gz 2026-03-24T12:09:13.484 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40069.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40069.log.gz 2026-03-24T12:09:13.484 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46687.log 2026-03-24T12:09:13.485 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35092.log 2026-03-24T12:09:13.485 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.37555.log: /var/log/ceph/ceph-client.admin.46687.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46687.log.gz 2026-03-24T12:09:13.485 INFO:teuthology.orchestra.run.vm05.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.37555.log.gz 2026-03-24T12:09:13.485 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72244.log 2026-03-24T12:09:13.485 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43313.log 2026-03-24T12:09:13.485 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35092.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35092.log.gz 2026-03-24T12:09:13.486 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72244.log: 84.3% -- 
replaced with /var/log/ceph/ceph-client.admin.72244.log.gz 2026-03-24T12:09:13.486 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90166.log 2026-03-24T12:09:13.486 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43313.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43313.log.gz 2026-03-24T12:09:13.487 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83058.log 2026-03-24T12:09:13.487 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36334.log 2026-03-24T12:09:13.487 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90166.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90166.log.gz 2026-03-24T12:09:13.487 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83058.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83058.log.gz 2026-03-24T12:09:13.487 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84672.log 2026-03-24T12:09:13.488 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71803.log 2026-03-24T12:09:13.488 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36334.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36334.log.gz 2026-03-24T12:09:13.488 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84672.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84672.log.gz 2026-03-24T12:09:13.488 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75752.log 2026-03-24T12:09:13.488 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91774.log 2026-03-24T12:09:13.488 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.71803.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.71803.log.gz 2026-03-24T12:09:13.489 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75752.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75752.log.gz 2026-03-24T12:09:13.489 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63574.log 2026-03-24T12:09:13.489 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82736.log 2026-03-24T12:09:13.489 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.91774.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91774.log.gz 2026-03-24T12:09:13.489 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63574.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63574.log.gz 2026-03-24T12:09:13.489 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43491.log 2026-03-24T12:09:13.490 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63244.log 2026-03-24T12:09:13.490 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82736.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82736.log.gz 2026-03-24T12:09:13.490 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43491.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43491.log.gz 2026-03-24T12:09:13.490 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60313.log 2026-03-24T12:09:13.491 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30283.log 2026-03-24T12:09:13.491 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.63244.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63244.log.gz 2026-03-24T12:09:13.491 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.60313.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.60313.log.gz 2026-03-24T12:09:13.491 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73564.log 2026-03-24T12:09:13.491 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79270.log 2026-03-24T12:09:13.491 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30283.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30283.log.gz 2026-03-24T12:09:13.492 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.73564.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73564.log.gz 2026-03-24T12:09:13.492 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86771.log 2026-03-24T12:09:13.492 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87354.log 2026-03-24T12:09:13.492 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.79270.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79270.log.gz 2026-03-24T12:09:13.492 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.86771.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86771.log.gz 2026-03-24T12:09:13.492 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73435.log 2026-03-24T12:09:13.493 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41147.log 2026-03-24T12:09:13.493 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.87354.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87354.log.gz 2026-03-24T12:09:13.493 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.73435.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73435.log.gz 2026-03-24T12:09:13.493 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.26769.log 2026-03-24T12:09:13.493 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27688.log 2026-03-24T12:09:13.494 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41147.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41147.log.gz 2026-03-24T12:09:13.494 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.26769.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26769.log.gz 2026-03-24T12:09:13.494 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47762.log 2026-03-24T12:09:13.494 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28684.log 2026-03-24T12:09:13.494 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27688.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27688.log.gz 2026-03-24T12:09:13.495 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.47762.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71352.log 2026-03-24T12:09:13.495 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.47762.log.gz 2026-03-24T12:09:13.495 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28684.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57937.log 2026-03-24T12:09:13.495 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.28684.log.gz 2026-03-24T12:09:13.496 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.71352.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71352.log.gz 2026-03-24T12:09:13.496 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86728.log 2026-03-24T12:09:13.496 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.84259.log 2026-03-24T12:09:13.496 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.57937.log: /var/log/ceph/ceph-client.admin.86728.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86728.log.gz 2026-03-24T12:09:13.496 INFO:teuthology.orchestra.run.vm05.stderr: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.57937.log.gz 2026-03-24T12:09:13.497 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85844.log 2026-03-24T12:09:13.497 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84259.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39898.log 2026-03-24T12:09:13.497 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84259.log.gz 2026-03-24T12:09:13.497 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.85844.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85844.log.gz 2026-03-24T12:09:13.497 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27224.log 2026-03-24T12:09:13.498 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82326.log 2026-03-24T12:09:13.498 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39898.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39898.log.gz 2026-03-24T12:09:13.498 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27224.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.27224.log.gz 2026-03-24T12:09:13.498 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38318.log 2026-03-24T12:09:13.498 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53990.log 2026-03-24T12:09:13.499 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82326.log: 
/var/log/ceph/ceph-client.admin.38318.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82326.log.gz 2026-03-24T12:09:13.499 INFO:teuthology.orchestra.run.vm05.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.38318.log.gz 2026-03-24T12:09:13.499 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61388.log 2026-03-24T12:09:13.499 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47125.log 2026-03-24T12:09:13.499 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.53990.log: /var/log/ceph/ceph-client.admin.61388.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53990.log.gz 2026-03-24T12:09:13.499 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61388.log.gz 2026-03-24T12:09:13.500 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69577.log 2026-03-24T12:09:13.500 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30999.log 2026-03-24T12:09:13.500 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.47125.log: /var/log/ceph/ceph-client.admin.69577.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47125.log.gz 2026-03-24T12:09:13.500 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69577.log.gz 2026-03-24T12:09:13.500 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45329.log 2026-03-24T12:09:13.501 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36773.log 2026-03-24T12:09:13.501 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30999.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30999.log.gz 2026-03-24T12:09:13.501 
INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45329.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45329.log.gz 2026-03-24T12:09:13.501 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66790.log 2026-03-24T12:09:13.502 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67690.log 2026-03-24T12:09:13.502 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36773.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36773.log.gz 2026-03-24T12:09:13.502 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66790.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66790.log.gz 2026-03-24T12:09:13.502 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57632.log 2026-03-24T12:09:13.503 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43043.log 2026-03-24T12:09:13.503 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67690.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67690.log.gz 2026-03-24T12:09:13.503 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.57632.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57632.log.gz 2026-03-24T12:09:13.503 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43146.log 2026-03-24T12:09:13.503 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27011.log 2026-03-24T12:09:13.504 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43043.log: /var/log/ceph/ceph-client.admin.43146.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43043.log.gz 2026-03-24T12:09:13.504 INFO:teuthology.orchestra.run.vm05.stderr: 26.8% -- replaced with /var/log/ceph/ceph-client.admin.43146.log.gz 
2026-03-24T12:09:13.504 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.79786.log
2026-03-24T12:09:13.504 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71459.log
2026-03-24T12:09:13.504 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27011.log: /var/log/ceph/ceph-client.admin.79786.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27011.log.gz
2026-03-24T12:09:13.504 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.79786.log.gz
2026-03-24T12:09:13.505 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43710.log
2026-03-24T12:09:13.505 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32699.log
2026-03-24T12:09:13.505 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.71459.log: /var/log/ceph/ceph-client.admin.43710.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71459.log.gz
2026-03-24T12:09:13.505 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43710.log.gz
2026-03-24T12:09:13.505 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73155.log
2026-03-24T12:09:13.505 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86384.log
2026-03-24T12:09:13.506 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32699.log: /var/log/ceph/ceph-client.admin.73155.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73155.log.gz
2026-03-24T12:09:13.506 INFO:teuthology.orchestra.run.vm05.stderr: 2.5% -- replaced with /var/log/ceph/ceph-client.admin.32699.log.gz
2026-03-24T12:09:13.506 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47700.log
2026-03-24T12:09:13.506 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47448.log
2026-03-24T12:09:13.506 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.86384.log: /var/log/ceph/ceph-client.admin.47700.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86384.log.gz
2026-03-24T12:09:13.506 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47700.log.gz
2026-03-24T12:09:13.507 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31407.log
2026-03-24T12:09:13.507 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57583.log
2026-03-24T12:09:13.507 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.47448.log: /var/log/ceph/ceph-client.admin.31407.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47448.log.gz
2026-03-24T12:09:13.507 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31407.log.gz
2026-03-24T12:09:13.507 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50800.log
2026-03-24T12:09:13.508 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77161.log
2026-03-24T12:09:13.508 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.57583.log: /var/log/ceph/ceph-client.admin.50800.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57583.log.gz
2026-03-24T12:09:13.508 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50800.log.gz
2026-03-24T12:09:13.508 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36688.log
2026-03-24T12:09:13.508 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83850.log
2026-03-24T12:09:13.508 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77161.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77161.log.gz
2026-03-24T12:09:13.509 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.36688.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36688.log.gz
2026-03-24T12:09:13.509 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82822.log
2026-03-24T12:09:13.509 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51251.log
2026-03-24T12:09:13.509 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83850.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83850.log.gz
2026-03-24T12:09:13.509 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82822.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82822.log.gz
2026-03-24T12:09:13.509 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75716.log
2026-03-24T12:09:13.510 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71180.log
2026-03-24T12:09:13.510 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.51251.log: /var/log/ceph/ceph-client.admin.75716.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51251.log.gz
2026-03-24T12:09:13.510 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75716.log.gz
2026-03-24T12:09:13.510 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47104.log
2026-03-24T12:09:13.511 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67755.log
2026-03-24T12:09:13.511 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.47104.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47104.log.gz
2026-03-24T12:09:13.511 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.71180.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71180.log.gz
2026-03-24T12:09:13.511 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41843.log
2026-03-24T12:09:13.511 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64130.log
2026-03-24T12:09:13.512 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67755.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67755.log.gz
2026-03-24T12:09:13.512 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41843.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41843.log.gz
2026-03-24T12:09:13.512 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31278.log
2026-03-24T12:09:13.512 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.86231.log
2026-03-24T12:09:13.512 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64130.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64130.log.gz
2026-03-24T12:09:13.512 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.31278.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31278.log.gz
2026-03-24T12:09:13.512 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.83594.log
2026-03-24T12:09:13.513 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66582.log
2026-03-24T12:09:13.513 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.86231.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.86231.log.gz
2026-03-24T12:09:13.513 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.83594.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.83594.log.gz
2026-03-24T12:09:13.513 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49932.log
2026-03-24T12:09:13.513 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.75911.log
2026-03-24T12:09:13.513 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66582.log: /var/log/ceph/ceph-client.admin.49932.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49932.log.gz
2026-03-24T12:09:13.514 INFO:teuthology.orchestra.run.vm05.stderr: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.66582.log.gz
2026-03-24T12:09:13.514 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47943.log
2026-03-24T12:09:13.514 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50395.log
2026-03-24T12:09:13.514 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.75911.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.75911.log.gz
2026-03-24T12:09:13.514 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.47943.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47943.log.gz
2026-03-24T12:09:13.514 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45482.log
2026-03-24T12:09:13.515 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31386.log
2026-03-24T12:09:13.515 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50395.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50395.log.gz
2026-03-24T12:09:13.515 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.45482.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45482.log.gz
2026-03-24T12:09:13.515 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71846.log
2026-03-24T12:09:13.515 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78787.log
2026-03-24T12:09:13.515 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.31386.log: /var/log/ceph/ceph-client.admin.71846.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31386.log.gz
2026-03-24T12:09:13.516 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71846.log.gz
2026-03-24T12:09:13.516 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84457.log
2026-03-24T12:09:13.516 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67131.log
2026-03-24T12:09:13.516 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.78787.log: /var/log/ceph/ceph-client.admin.84457.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78787.log.gz
2026-03-24T12:09:13.516 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84457.log.gz
2026-03-24T12:09:13.516 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66981.log
2026-03-24T12:09:13.517 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59539.log
2026-03-24T12:09:13.517 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.67131.log: /var/log/ceph/ceph-client.admin.66981.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67131.log.gz
2026-03-24T12:09:13.517 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66981.log.gz
2026-03-24T12:09:13.517 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64324.log
2026-03-24T12:09:13.517 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60270.log
2026-03-24T12:09:13.517 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.59539.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59539.log.gz
2026-03-24T12:09:13.518 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64324.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64324.log.gz
2026-03-24T12:09:13.518 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76148.log
2026-03-24T12:09:13.518 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56141.log
2026-03-24T12:09:13.518 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.60270.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60270.log.gz
2026-03-24T12:09:13.518 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.76148.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76148.log.gz
2026-03-24T12:09:13.518 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-osd.2.log
2026-03-24T12:09:13.519 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52987.log
2026-03-24T12:09:13.519 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.56141.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56141.log.gz
2026-03-24T12:09:13.519 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-osd.2.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29359.log
2026-03-24T12:09:13.519 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.52987.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52987.log.gz
2026-03-24T12:09:13.520 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77555.log
2026-03-24T12:09:13.520 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.29359.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29359.log.gz
2026-03-24T12:09:13.520 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91592.log
2026-03-24T12:09:13.521 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.77555.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77555.log.gz
2026-03-24T12:09:13.521 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.76817.log
2026-03-24T12:09:13.521 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.91592.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91592.log.gz
2026-03-24T12:09:13.521 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32307.log
2026-03-24T12:09:13.522 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.76817.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.76817.log.gz
2026-03-24T12:09:13.522 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42185.log
2026-03-24T12:09:13.522 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32307.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32307.log.gz
2026-03-24T12:09:13.523 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29595.log
2026-03-24T12:09:13.523 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.42185.log: 26.0% -- replaced with /var/log/ceph/ceph-client.admin.42185.log.gz
2026-03-24T12:09:13.523 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27300.log
2026-03-24T12:09:13.524 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.29595.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29595.log.gz
2026-03-24T12:09:13.524 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38066.log
2026-03-24T12:09:13.524 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27300.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.27300.log.gz
2026-03-24T12:09:13.524 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41884.log
2026-03-24T12:09:13.525 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.38066.log: 26.7% -- replaced with /var/log/ceph/ceph-client.admin.38066.log.gz
2026-03-24T12:09:13.525 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71932.log
2026-03-24T12:09:13.525 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41884.log: 52.9% -- replaced with /var/log/ceph/ceph-client.admin.41884.log.gz
2026-03-24T12:09:13.526 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41761.log
2026-03-24T12:09:13.526 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.71932.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71932.log.gz
2026-03-24T12:09:13.526 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.85736.log
2026-03-24T12:09:13.527 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41761.log: 59.2% -- replaced with /var/log/ceph/ceph-client.admin.41761.log.gz
2026-03-24T12:09:13.527 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28721.log
2026-03-24T12:09:13.527 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.85736.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.85736.log.gz
2026-03-24T12:09:13.527 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30305.log
2026-03-24T12:09:13.528 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28721.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28721.log.gz
2026-03-24T12:09:13.528 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.84216.log
2026-03-24T12:09:13.528 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30305.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30305.log.gz
2026-03-24T12:09:13.529 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38570.log
2026-03-24T12:09:13.529 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.84216.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.84216.log.gz
2026-03-24T12:09:13.529 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33532.log
2026-03-24T12:09:13.529 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.38570.log: 26.8% -- replaced with /var/log/ceph/ceph-client.admin.38570.log.gz
2026-03-24T12:09:13.530 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33412.log
2026-03-24T12:09:13.530 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.33532.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33532.log.gz
2026-03-24T12:09:13.530 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27376.log
2026-03-24T12:09:13.531 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.33412.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33412.log.gz
2026-03-24T12:09:13.531 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28416.log
2026-03-24T12:09:13.531 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27376.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.27376.log.gz
2026-03-24T12:09:13.531 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.72748.log
2026-03-24T12:09:13.532 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28416.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28416.log.gz
2026-03-24T12:09:13.532 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74173.log
2026-03-24T12:09:13.532 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.72748.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.72748.log.gz
2026-03-24T12:09:13.533 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66725.log
2026-03-24T12:09:13.533 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74173.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74173.log.gz
2026-03-24T12:09:13.533 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39511.log
2026-03-24T12:09:13.533 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.66725.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66725.log.gz
2026-03-24T12:09:13.534 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25788.log
2026-03-24T12:09:13.534 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39511.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39511.log.gz
2026-03-24T12:09:13.534 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62517.log
2026-03-24T12:09:13.535 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.25788.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25788.log.gz
2026-03-24T12:09:13.535 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48521.log
2026-03-24T12:09:13.535 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62517.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62517.log.gz
2026-03-24T12:09:13.535 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28352.log
2026-03-24T12:09:13.536 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.48521.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48521.log.gz
2026-03-24T12:09:13.536 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62460.log
2026-03-24T12:09:13.536 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28352.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28352.log.gz
2026-03-24T12:09:13.537 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35444.log
2026-03-24T12:09:13.537 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62460.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62460.log.gz
2026-03-24T12:09:13.537 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.74607.log
2026-03-24T12:09:13.537 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.35444.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35444.log.gz
2026-03-24T12:09:13.538 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.80595.log
2026-03-24T12:09:13.538 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.74607.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.74607.log.gz
2026-03-24T12:09:13.538 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32222.log
2026-03-24T12:09:13.538 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.80595.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.80595.log.gz
2026-03-24T12:09:13.539 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48501.log
2026-03-24T12:09:13.539 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32222.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32222.log.gz
2026-03-24T12:09:13.539 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64640.log
2026-03-24T12:09:13.540 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.48501.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48501.log.gz
2026-03-24T12:09:13.540 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40477.log
2026-03-24T12:09:13.540 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.64640.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64640.log.gz
2026-03-24T12:09:13.540 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26447.log
2026-03-24T12:09:13.541 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.40477.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40477.log.gz
2026-03-24T12:09:13.541 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37324.log
2026-03-24T12:09:13.541 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.26447.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26447.log.gz
2026-03-24T12:09:13.541 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27584.log
2026-03-24T12:09:13.542 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.37324.log: 25.5% -- replaced with /var/log/ceph/ceph-client.admin.37324.log.gz
2026-03-24T12:09:13.542 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32239.log
2026-03-24T12:09:13.543 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.27584.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27584.log.gz
2026-03-24T12:09:13.543 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29789.log
2026-03-24T12:09:13.543 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32239.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32239.log.gz
2026-03-24T12:09:13.543 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-mon.a.log
2026-03-24T12:09:13.544 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.29789.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29789.log.gz
2026-03-24T12:09:13.544 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58790.log
2026-03-24T12:09:13.551 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-mon.a.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57055.log
2026-03-24T12:09:13.559 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.58790.log: 29.3% -- replaced with /var/log/ceph/ceph-client.admin.58790.log.gz
2026-03-24T12:09:13.563 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90252.log
2026-03-24T12:09:13.563 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.57055.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57055.log.gz
2026-03-24T12:09:13.579 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43186.log
2026-03-24T12:09:13.579 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90252.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90252.log.gz
2026-03-24T12:09:13.603 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.87698.log
2026-03-24T12:09:13.603 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.43186.log: 58.2% -- replaced with /var/log/ceph/ceph-client.admin.43186.log.gz
2026-03-24T12:09:13.611 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38633.log
2026-03-24T12:09:13.611 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.87698.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.87698.log.gz
2026-03-24T12:09:13.619 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47639.log
2026-03-24T12:09:13.627 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.38633.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42143.log
2026-03-24T12:09:13.627 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.47639.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47639.log.gz
2026-03-24T12:09:13.630 INFO:teuthology.orchestra.run.vm05.stderr: 26.4% -- replaced with /var/log/ceph/ceph-client.admin.38633.log.gz
2026-03-24T12:09:13.643 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32921.log
2026-03-24T12:09:13.643 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.42143.log: 26.6% -- replaced with /var/log/ceph/ceph-client.admin.42143.log.gz
2026-03-24T12:09:13.651 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.89929.log
2026-03-24T12:09:13.659 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32921.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46063.log
2026-03-24T12:09:13.659 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.89929.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.89929.log.gz
2026-03-24T12:09:13.662 INFO:teuthology.orchestra.run.vm05.stderr: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32921.log.gz
2026-03-24T12:09:13.675 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30197.log
2026-03-24T12:09:13.675 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.46063.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46063.log.gz
2026-03-24T12:09:13.691 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28330.log
2026-03-24T12:09:13.691 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.30197.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30197.log.gz
2026-03-24T12:09:13.696 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29896.log
2026-03-24T12:09:13.696 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.28330.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28330.log.gz
2026-03-24T12:09:13.703 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.90381.log
2026-03-24T12:09:13.703 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.29896.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29896.log.gz
2026-03-24T12:09:13.711 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.91242.log
2026-03-24T12:09:13.711 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.90381.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.90381.log.gz
2026-03-24T12:09:13.719 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68357.log
2026-03-24T12:09:13.719 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.91242.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.91242.log.gz
2026-03-24T12:09:13.727 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31514.log
2026-03-24T12:09:13.727 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.68357.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68357.log.gz
2026-03-24T12:09:13.727 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50082.log
2026-03-24T12:09:13.727 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.31514.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31514.log.gz
2026-03-24T12:09:13.728 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32034.log
2026-03-24T12:09:13.728 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.50082.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50082.log.gz
2026-03-24T12:09:13.728 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41491.log
2026-03-24T12:09:13.728 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.32034.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.32034.log.gz
2026-03-24T12:09:13.729 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55803.log
2026-03-24T12:09:13.729 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41491.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41491.log.gz
2026-03-24T12:09:13.729 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.82115.log
2026-03-24T12:09:13.729 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.55803.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55803.log.gz
2026-03-24T12:09:13.730 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49114.log
2026-03-24T12:09:13.730 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.82115.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.82115.log.gz
2026-03-24T12:09:13.730 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41255.log
2026-03-24T12:09:13.730 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.49114.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49114.log.gz
2026-03-24T12:09:13.731 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31694.log
2026-03-24T12:09:13.731 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.41255.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41255.log.gz
2026-03-24T12:09:13.731 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62031.log
2026-03-24T12:09:13.731 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.31694.log: 1.2% -- replaced with /var/log/ceph/ceph-client.admin.31694.log.gz
2026-03-24T12:09:13.731 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44598.log
2026-03-24T12:09:13.732 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.62031.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62031.log.gz
2026-03-24T12:09:13.732 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39812.log
2026-03-24T12:09:13.732 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.44598.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44598.log.gz
2026-03-24T12:09:13.732 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.73048.log
2026-03-24T12:09:13.733 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.39812.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39812.log.gz
2026-03-24T12:09:13.733 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/ceph-client.admin.73048.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.73048.log.gz
2026-03-24T12:09:14.574 INFO:teuthology.orchestra.run.vm05.stderr: 92.0% -- replaced with /var/log/ceph/ceph-mon.a.log.gz
2026-03-24T12:09:20.583 INFO:teuthology.orchestra.run.vm05.stderr: 94.1% -- replaced with /var/log/ceph/ceph-osd.2.log.gz
2026-03-24T12:09:22.823 INFO:teuthology.orchestra.run.vm05.stderr: 93.9% -- replaced with /var/log/ceph/ceph-osd.0.log.gz
2026-03-24T12:09:27.797 INFO:teuthology.orchestra.run.vm05.stderr: 94.1% -- replaced with /var/log/ceph/ceph-osd.1.log.gz
2026-03-24T12:09:27.798 INFO:teuthology.orchestra.run.vm05.stderr:
2026-03-24T12:09:27.798 INFO:teuthology.orchestra.run.vm05.stderr:real 0m15.305s
2026-03-24T12:09:27.798 INFO:teuthology.orchestra.run.vm05.stderr:user 0m32.209s
2026-03-24T12:09:27.798 INFO:teuthology.orchestra.run.vm05.stderr:sys 0m2.614s
2026-03-24T12:09:27.798 INFO:tasks.ceph:Archiving logs...
2026-03-24T12:09:27.798 DEBUG:teuthology.misc:Transferring archived files from vm05:/var/log/ceph to /archive/kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps/3589/remote/vm05/log
2026-03-24T12:09:27.798 DEBUG:teuthology.orchestra.run.vm05:> sudo tar c -f - -C /var/log/ceph -- .
2026-03-24T12:09:29.581 DEBUG:teuthology.run_tasks:Unwinding manager install
2026-03-24T12:09:29.583 INFO:teuthology.task.install.util:Removing shipped files: /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer...
2026-03-24T12:09:29.584 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -f -- /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer
2026-03-24T12:09:29.610 INFO:teuthology.task.install.deb:Removing packages: ceph, cephadm, ceph-mds, ceph-mgr, ceph-common, ceph-fuse, ceph-test, ceph-volume, radosgw, python3-rados, python3-rgw, python3-cephfs, python3-rbd, libcephfs2, libcephfs-dev, librados2, librbd1, rbd-fuse on Debian system.
2026-03-24T12:09:29.610 DEBUG:teuthology.orchestra.run.vm05:> for d in ceph cephadm ceph-mds ceph-mgr ceph-common ceph-fuse ceph-test ceph-volume radosgw python3-rados python3-rgw python3-cephfs python3-rbd libcephfs2 libcephfs-dev librados2 librbd1 rbd-fuse ; do sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" purge $d || true ; done
2026-03-24T12:09:29.688 INFO:teuthology.orchestra.run.vm05.stdout:Reading package lists...
2026-03-24T12:09:29.864 INFO:teuthology.orchestra.run.vm05.stdout:Building dependency tree...
2026-03-24T12:09:29.864 INFO:teuthology.orchestra.run.vm05.stdout:Reading state information...
2026-03-24T12:09:29.977 INFO:teuthology.orchestra.run.vm05.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T12:09:29.977 INFO:teuthology.orchestra.run.vm05.stdout:  ceph-mon kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-24T12:09:29.977 INFO:teuthology.orchestra.run.vm05.stdout:  libsgutils2-2 sg3-utils sg3-utils-udev
2026-03-24T12:09:29.977 INFO:teuthology.orchestra.run.vm05.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T12:09:29.988 INFO:teuthology.orchestra.run.vm05.stdout:The following packages will be REMOVED:
2026-03-24T12:09:29.988 INFO:teuthology.orchestra.run.vm05.stdout:  ceph*
2026-03-24T12:09:30.170 INFO:teuthology.orchestra.run.vm05.stdout:0 upgraded, 0 newly installed, 1 to remove and 44 not upgraded.
2026-03-24T12:09:30.170 INFO:teuthology.orchestra.run.vm05.stdout:After this operation, 47.1 kB disk space will be freed.
2026-03-24T12:09:30.226 INFO:teuthology.orchestra.run.vm05.stdout:(Reading database ... 126150 files and directories currently installed.)
2026-03-24T12:09:30.229 INFO:teuthology.orchestra.run.vm05.stdout:Removing ceph (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:31.256 INFO:teuthology.orchestra.run.vm05.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T12:09:31.295 INFO:teuthology.orchestra.run.vm05.stdout:Reading package lists...
2026-03-24T12:09:31.473 INFO:teuthology.orchestra.run.vm05.stdout:Building dependency tree...
2026-03-24T12:09:31.473 INFO:teuthology.orchestra.run.vm05.stdout:Reading state information...
2026-03-24T12:09:31.589 INFO:teuthology.orchestra.run.vm05.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T12:09:31.589 INFO:teuthology.orchestra.run.vm05.stdout:  ceph-mon kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-24T12:09:31.590 INFO:teuthology.orchestra.run.vm05.stdout:  libsgutils2-2 python-asyncssh-doc python3-asyncssh sg3-utils sg3-utils-udev
2026-03-24T12:09:31.590 INFO:teuthology.orchestra.run.vm05.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T12:09:31.600 INFO:teuthology.orchestra.run.vm05.stdout:The following packages will be REMOVED:
2026-03-24T12:09:31.600 INFO:teuthology.orchestra.run.vm05.stdout:  ceph-mgr-cephadm* cephadm*
2026-03-24T12:09:31.919 INFO:teuthology.orchestra.run.vm05.stdout:0 upgraded, 0 newly installed, 2 to remove and 44 not upgraded.
2026-03-24T12:09:31.919 INFO:teuthology.orchestra.run.vm05.stdout:After this operation, 2177 kB disk space will be freed.
2026-03-24T12:09:31.958 INFO:teuthology.orchestra.run.vm05.stdout:(Reading database ... 126148 files and directories currently installed.)
2026-03-24T12:09:31.960 INFO:teuthology.orchestra.run.vm05.stdout:Removing ceph-mgr-cephadm (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:31.973 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-cephadm, directory '/usr/share/ceph/mgr/cephadm/services' not empty so not removed
2026-03-24T12:09:31.984 INFO:teuthology.orchestra.run.vm05.stdout:Removing cephadm (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:32.016 INFO:teuthology.orchestra.run.vm05.stdout:Looking for files to backup/remove ...
2026-03-24T12:09:32.018 INFO:teuthology.orchestra.run.vm05.stdout:Not backing up/removing `/var/lib/cephadm', it matches ^/var/.*.
2026-03-24T12:09:32.020 INFO:teuthology.orchestra.run.vm05.stdout:Removing user `cephadm' ...
2026-03-24T12:09:32.020 INFO:teuthology.orchestra.run.vm05.stdout:Warning: group `nogroup' has no more members.
2026-03-24T12:09:32.050 INFO:teuthology.orchestra.run.vm05.stdout:Done.
2026-03-24T12:09:32.074 INFO:teuthology.orchestra.run.vm05.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-24T12:09:32.172 INFO:teuthology.orchestra.run.vm05.stdout:(Reading database ... 126062 files and directories currently installed.)
2026-03-24T12:09:32.174 INFO:teuthology.orchestra.run.vm05.stdout:Purging configuration files for cephadm (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:33.190 INFO:teuthology.orchestra.run.vm05.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T12:09:33.228 INFO:teuthology.orchestra.run.vm05.stdout:Reading package lists...
2026-03-24T12:09:33.403 INFO:teuthology.orchestra.run.vm05.stdout:Building dependency tree...
2026-03-24T12:09:33.403 INFO:teuthology.orchestra.run.vm05.stdout:Reading state information...
2026-03-24T12:09:33.513 INFO:teuthology.orchestra.run.vm05.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T12:09:33.513 INFO:teuthology.orchestra.run.vm05.stdout:  ceph-mon kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-24T12:09:33.513 INFO:teuthology.orchestra.run.vm05.stdout:  libsgutils2-2 python-asyncssh-doc python3-asyncssh sg3-utils sg3-utils-udev
2026-03-24T12:09:33.513 INFO:teuthology.orchestra.run.vm05.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T12:09:33.523 INFO:teuthology.orchestra.run.vm05.stdout:The following packages will be REMOVED:
2026-03-24T12:09:33.523 INFO:teuthology.orchestra.run.vm05.stdout:  ceph-mds*
2026-03-24T12:09:33.685 INFO:teuthology.orchestra.run.vm05.stdout:0 upgraded, 0 newly installed, 1 to remove and 44 not upgraded.
2026-03-24T12:09:33.685 INFO:teuthology.orchestra.run.vm05.stdout:After this operation, 6851 kB disk space will be freed.
2026-03-24T12:09:33.724 INFO:teuthology.orchestra.run.vm05.stdout:(Reading database ... 126062 files and directories currently installed.)
2026-03-24T12:09:33.726 INFO:teuthology.orchestra.run.vm05.stdout:Removing ceph-mds (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:34.143 INFO:teuthology.orchestra.run.vm05.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-24T12:09:34.239 INFO:teuthology.orchestra.run.vm05.stdout:(Reading database ... 126054 files and directories currently installed.)
2026-03-24T12:09:34.241 INFO:teuthology.orchestra.run.vm05.stdout:Purging configuration files for ceph-mds (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:35.677 INFO:teuthology.orchestra.run.vm05.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T12:09:35.717 INFO:teuthology.orchestra.run.vm05.stdout:Reading package lists...
2026-03-24T12:09:35.888 INFO:teuthology.orchestra.run.vm05.stdout:Building dependency tree...
2026-03-24T12:09:35.889 INFO:teuthology.orchestra.run.vm05.stdout:Reading state information...
2026-03-24T12:09:36.009 INFO:teuthology.orchestra.run.vm05.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T12:09:36.009 INFO:teuthology.orchestra.run.vm05.stdout:  ceph-mgr-modules-core ceph-mon kpartx libboost-iostreams1.74.0
2026-03-24T12:09:36.009 INFO:teuthology.orchestra.run.vm05.stdout:  libboost-thread1.74.0 libpmemobj1 libsgutils2-2 python-asyncssh-doc
2026-03-24T12:09:36.009 INFO:teuthology.orchestra.run.vm05.stdout:  python3-asyncssh python3-cachetools python3-cheroot python3-cherrypy3
2026-03-24T12:09:36.009 INFO:teuthology.orchestra.run.vm05.stdout:  python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-24T12:09:36.009 INFO:teuthology.orchestra.run.vm05.stdout:  python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-24T12:09:36.009 INFO:teuthology.orchestra.run.vm05.stdout:  python3-kubernetes python3-natsort python3-portend python3-psutil
2026-03-24T12:09:36.009 INFO:teuthology.orchestra.run.vm05.stdout:  python3-repoze.lru python3-requests-oauthlib python3-routes python3-rsa
2026-03-24T12:09:36.009 INFO:teuthology.orchestra.run.vm05.stdout:  python3-simplejson python3-sklearn python3-sklearn-lib python3-tempora
2026-03-24T12:09:36.009 INFO:teuthology.orchestra.run.vm05.stdout:  python3-threadpoolctl python3-webob python3-websocket python3-zc.lockfile
2026-03-24T12:09:36.009 INFO:teuthology.orchestra.run.vm05.stdout:  sg3-utils sg3-utils-udev
2026-03-24T12:09:36.009 INFO:teuthology.orchestra.run.vm05.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T12:09:36.019 INFO:teuthology.orchestra.run.vm05.stdout:The following packages will be REMOVED:
2026-03-24T12:09:36.019 INFO:teuthology.orchestra.run.vm05.stdout:  ceph-mgr* ceph-mgr-dashboard* ceph-mgr-diskprediction-local*
2026-03-24T12:09:36.019 INFO:teuthology.orchestra.run.vm05.stdout:  ceph-mgr-k8sevents*
2026-03-24T12:09:36.191 INFO:teuthology.orchestra.run.vm05.stdout:0 upgraded, 0 newly installed, 4 to remove and 44 not upgraded.
2026-03-24T12:09:36.192 INFO:teuthology.orchestra.run.vm05.stdout:After this operation, 219 MB disk space will be freed.
2026-03-24T12:09:36.233 INFO:teuthology.orchestra.run.vm05.stdout:(Reading database ... 126054 files and directories currently installed.)
2026-03-24T12:09:36.235 INFO:teuthology.orchestra.run.vm05.stdout:Removing ceph-mgr-k8sevents (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:36.247 INFO:teuthology.orchestra.run.vm05.stdout:Removing ceph-mgr-diskprediction-local (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:36.259 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-diskprediction-local, directory '/usr/share/ceph/mgr/diskprediction_local' not empty so not removed
2026-03-24T12:09:36.270 INFO:teuthology.orchestra.run.vm05.stdout:Removing ceph-mgr-dashboard (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:36.328 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-dashboard, directory '/usr/share/ceph/mgr/dashboard/services/auth' not empty so not removed
2026-03-24T12:09:36.328 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-dashboard, directory '/usr/share/ceph/mgr/dashboard/plugins' not empty so not removed
2026-03-24T12:09:36.328 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-dashboard, directory '/usr/share/ceph/mgr/dashboard/model' not empty so not removed
2026-03-24T12:09:36.328 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-dashboard, directory '/usr/share/ceph/mgr/dashboard/controllers' not empty so not removed
2026-03-24T12:09:36.328 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-dashboard, directory '/usr/share/ceph/mgr/dashboard/api' not empty so not removed
2026-03-24T12:09:36.338 INFO:teuthology.orchestra.run.vm05.stdout:Removing ceph-mgr (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:36.806 INFO:teuthology.orchestra.run.vm05.stdout:(Reading database ... 124272 files and directories currently installed.)
2026-03-24T12:09:36.808 INFO:teuthology.orchestra.run.vm05.stdout:Purging configuration files for ceph-mgr (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:37.212 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr, directory '/var/lib/ceph/mgr' not empty so not removed
2026-03-24T12:09:38.209 INFO:teuthology.orchestra.run.vm05.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T12:09:38.249 INFO:teuthology.orchestra.run.vm05.stdout:Reading package lists...
2026-03-24T12:09:38.422 INFO:teuthology.orchestra.run.vm05.stdout:Building dependency tree...
2026-03-24T12:09:38.423 INFO:teuthology.orchestra.run.vm05.stdout:Reading state information...
2026-03-24T12:09:38.538 INFO:teuthology.orchestra.run.vm05.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T12:09:38.538 INFO:teuthology.orchestra.run.vm05.stdout:  ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-24T12:09:38.538 INFO:teuthology.orchestra.run.vm05.stdout:  libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-24T12:09:38.538 INFO:teuthology.orchestra.run.vm05.stdout:  libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc
2026-03-24T12:09:38.538 INFO:teuthology.orchestra.run.vm05.stdout:  python3-asyncssh python3-cachetools python3-ceph-common python3-cheroot
2026-03-24T12:09:38.538 INFO:teuthology.orchestra.run.vm05.stdout:  python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-24T12:09:38.538 INFO:teuthology.orchestra.run.vm05.stdout:  python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-24T12:09:38.538 INFO:teuthology.orchestra.run.vm05.stdout:  python3-joblib python3-kubernetes python3-natsort python3-portend
2026-03-24T12:09:38.538 INFO:teuthology.orchestra.run.vm05.stdout:  python3-prettytable python3-psutil python3-repoze.lru
2026-03-24T12:09:38.538 INFO:teuthology.orchestra.run.vm05.stdout:  python3-requests-oauthlib python3-routes python3-rsa python3-simplejson
2026-03-24T12:09:38.538 INFO:teuthology.orchestra.run.vm05.stdout:  python3-sklearn python3-sklearn-lib python3-tempora python3-threadpoolctl
2026-03-24T12:09:38.539 INFO:teuthology.orchestra.run.vm05.stdout:  python3-wcwidth python3-webob python3-websocket python3-zc.lockfile
2026-03-24T12:09:38.539 INFO:teuthology.orchestra.run.vm05.stdout:  sg3-utils sg3-utils-udev smartmontools socat xmlstarlet
2026-03-24T12:09:38.539 INFO:teuthology.orchestra.run.vm05.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T12:09:38.548 INFO:teuthology.orchestra.run.vm05.stdout:The following packages will be REMOVED:
2026-03-24T12:09:38.548 INFO:teuthology.orchestra.run.vm05.stdout:  ceph-base* ceph-common* ceph-mon* ceph-osd* ceph-test* ceph-volume* radosgw*
2026-03-24T12:09:38.720 INFO:teuthology.orchestra.run.vm05.stdout:0 upgraded, 0 newly installed, 7 to remove and 44 not upgraded.
2026-03-24T12:09:38.720 INFO:teuthology.orchestra.run.vm05.stdout:After this operation, 732 MB disk space will be freed.
2026-03-24T12:09:38.756 INFO:teuthology.orchestra.run.vm05.stdout:(Reading database ... 124271 files and directories currently installed.)
2026-03-24T12:09:38.758 INFO:teuthology.orchestra.run.vm05.stdout:Removing ceph-volume (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:38.824 INFO:teuthology.orchestra.run.vm05.stdout:Removing ceph-osd (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:39.243 INFO:teuthology.orchestra.run.vm05.stdout:Removing ceph-mon (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:39.648 INFO:teuthology.orchestra.run.vm05.stdout:Removing ceph-base (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:40.084 INFO:teuthology.orchestra.run.vm05.stdout:Removing radosgw (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:40.499 INFO:teuthology.orchestra.run.vm05.stdout:Removing ceph-test (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:40.523 INFO:teuthology.orchestra.run.vm05.stdout:Removing ceph-common (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:40.954 INFO:teuthology.orchestra.run.vm05.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-24T12:09:40.994 INFO:teuthology.orchestra.run.vm05.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ...
2026-03-24T12:09:41.067 INFO:teuthology.orchestra.run.vm05.stdout:(Reading database ... 123780 files and directories currently installed.)
2026-03-24T12:09:41.069 INFO:teuthology.orchestra.run.vm05.stdout:Purging configuration files for radosgw (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:41.704 INFO:teuthology.orchestra.run.vm05.stdout:Purging configuration files for ceph-mon (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:42.126 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mon, directory '/var/lib/ceph/mon' not empty so not removed
2026-03-24T12:09:42.136 INFO:teuthology.orchestra.run.vm05.stdout:Purging configuration files for ceph-base (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:42.629 INFO:teuthology.orchestra.run.vm05.stdout:Purging configuration files for ceph-common (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:43.147 INFO:teuthology.orchestra.run.vm05.stdout:Purging configuration files for ceph-osd (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:43.534 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-osd, directory '/var/lib/ceph/osd' not empty so not removed
2026-03-24T12:09:44.539 INFO:teuthology.orchestra.run.vm05.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T12:09:44.577 INFO:teuthology.orchestra.run.vm05.stdout:Reading package lists...
2026-03-24T12:09:44.724 INFO:teuthology.orchestra.run.vm05.stdout:Building dependency tree...
2026-03-24T12:09:44.725 INFO:teuthology.orchestra.run.vm05.stdout:Reading state information...
2026-03-24T12:09:44.837 INFO:teuthology.orchestra.run.vm05.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T12:09:44.837 INFO:teuthology.orchestra.run.vm05.stdout:  ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-24T12:09:44.837 INFO:teuthology.orchestra.run.vm05.stdout:  libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-24T12:09:44.837 INFO:teuthology.orchestra.run.vm05.stdout:  libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc
2026-03-24T12:09:44.837 INFO:teuthology.orchestra.run.vm05.stdout:  python3-asyncssh python3-cachetools python3-ceph-common python3-cheroot
2026-03-24T12:09:44.837 INFO:teuthology.orchestra.run.vm05.stdout:  python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-24T12:09:44.837 INFO:teuthology.orchestra.run.vm05.stdout:  python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-24T12:09:44.837 INFO:teuthology.orchestra.run.vm05.stdout:  python3-joblib python3-kubernetes python3-natsort python3-portend
2026-03-24T12:09:44.837 INFO:teuthology.orchestra.run.vm05.stdout:  python3-prettytable python3-psutil python3-repoze.lru
2026-03-24T12:09:44.837 INFO:teuthology.orchestra.run.vm05.stdout:  python3-requests-oauthlib python3-routes python3-rsa python3-simplejson
2026-03-24T12:09:44.837 INFO:teuthology.orchestra.run.vm05.stdout:  python3-sklearn python3-sklearn-lib python3-tempora python3-threadpoolctl
2026-03-24T12:09:44.837 INFO:teuthology.orchestra.run.vm05.stdout:  python3-wcwidth python3-webob python3-websocket python3-zc.lockfile
2026-03-24T12:09:44.837 INFO:teuthology.orchestra.run.vm05.stdout:  sg3-utils sg3-utils-udev smartmontools socat xmlstarlet
2026-03-24T12:09:44.837 INFO:teuthology.orchestra.run.vm05.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T12:09:44.846 INFO:teuthology.orchestra.run.vm05.stdout:The following packages will be REMOVED:
2026-03-24T12:09:44.846 INFO:teuthology.orchestra.run.vm05.stdout:  ceph-fuse*
2026-03-24T12:09:45.007 INFO:teuthology.orchestra.run.vm05.stdout:0 upgraded, 0 newly installed, 1 to remove and 44 not upgraded.
2026-03-24T12:09:45.007 INFO:teuthology.orchestra.run.vm05.stdout:After this operation, 2932 kB disk space will be freed.
2026-03-24T12:09:45.045 INFO:teuthology.orchestra.run.vm05.stdout:(Reading database ... 123764 files and directories currently installed.)
2026-03-24T12:09:45.047 INFO:teuthology.orchestra.run.vm05.stdout:Removing ceph-fuse (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:45.454 INFO:teuthology.orchestra.run.vm05.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-24T12:09:45.552 INFO:teuthology.orchestra.run.vm05.stdout:(Reading database ... 123755 files and directories currently installed.)
2026-03-24T12:09:45.554 INFO:teuthology.orchestra.run.vm05.stdout:Purging configuration files for ceph-fuse (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:46.970 INFO:teuthology.orchestra.run.vm05.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T12:09:47.006 INFO:teuthology.orchestra.run.vm05.stdout:Reading package lists...
2026-03-24T12:09:47.186 INFO:teuthology.orchestra.run.vm05.stdout:Building dependency tree...
2026-03-24T12:09:47.186 INFO:teuthology.orchestra.run.vm05.stdout:Reading state information...
2026-03-24T12:09:47.311 INFO:teuthology.orchestra.run.vm05.stdout:Package 'ceph-test' is not installed, so not removed 2026-03-24T12:09:47.311 INFO:teuthology.orchestra.run.vm05.stdout:The following packages were automatically installed and are no longer required: 2026-03-24T12:09:47.311 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-24T12:09:47.311 INFO:teuthology.orchestra.run.vm05.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1 2026-03-24T12:09:47.311 INFO:teuthology.orchestra.run.vm05.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc 2026-03-24T12:09:47.311 INFO:teuthology.orchestra.run.vm05.stdout: python3-asyncssh python3-cachetools python3-ceph-common python3-cheroot 2026-03-24T12:09:47.311 INFO:teuthology.orchestra.run.vm05.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-24T12:09:47.311 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-24T12:09:47.311 INFO:teuthology.orchestra.run.vm05.stdout: python3-joblib python3-kubernetes python3-natsort python3-portend 2026-03-24T12:09:47.311 INFO:teuthology.orchestra.run.vm05.stdout: python3-prettytable python3-psutil python3-repoze.lru 2026-03-24T12:09:47.311 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplejson 2026-03-24T12:09:47.311 INFO:teuthology.orchestra.run.vm05.stdout: python3-sklearn python3-sklearn-lib python3-tempora python3-threadpoolctl 2026-03-24T12:09:47.311 INFO:teuthology.orchestra.run.vm05.stdout: python3-wcwidth python3-webob python3-websocket python3-zc.lockfile 2026-03-24T12:09:47.311 INFO:teuthology.orchestra.run.vm05.stdout: sg3-utils sg3-utils-udev smartmontools socat xmlstarlet 2026-03-24T12:09:47.311 INFO:teuthology.orchestra.run.vm05.stdout:Use 'sudo apt autoremove' to remove them. 
2026-03-24T12:09:47.328 INFO:teuthology.orchestra.run.vm05.stdout:0 upgraded, 0 newly installed, 0 to remove and 44 not upgraded.
2026-03-24T12:09:47.328 INFO:teuthology.orchestra.run.vm05.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T12:09:47.364 INFO:teuthology.orchestra.run.vm05.stdout:Reading package lists...
2026-03-24T12:09:47.525 INFO:teuthology.orchestra.run.vm05.stdout:Building dependency tree...
2026-03-24T12:09:47.526 INFO:teuthology.orchestra.run.vm05.stdout:Reading state information...
2026-03-24T12:09:47.640 INFO:teuthology.orchestra.run.vm05.stdout:Package 'ceph-volume' is not installed, so not removed
2026-03-24T12:09:47.640 INFO:teuthology.orchestra.run.vm05.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T12:09:47.640 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-24T12:09:47.640 INFO:teuthology.orchestra.run.vm05.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-24T12:09:47.640 INFO:teuthology.orchestra.run.vm05.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc
2026-03-24T12:09:47.640 INFO:teuthology.orchestra.run.vm05.stdout: python3-asyncssh python3-cachetools python3-ceph-common python3-cheroot
2026-03-24T12:09:47.640 INFO:teuthology.orchestra.run.vm05.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-24T12:09:47.640 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-24T12:09:47.640 INFO:teuthology.orchestra.run.vm05.stdout: python3-joblib python3-kubernetes python3-natsort python3-portend
2026-03-24T12:09:47.640 INFO:teuthology.orchestra.run.vm05.stdout: python3-prettytable python3-psutil python3-repoze.lru
2026-03-24T12:09:47.640 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplejson
2026-03-24T12:09:47.640 INFO:teuthology.orchestra.run.vm05.stdout: python3-sklearn python3-sklearn-lib python3-tempora python3-threadpoolctl
2026-03-24T12:09:47.640 INFO:teuthology.orchestra.run.vm05.stdout: python3-wcwidth python3-webob python3-websocket python3-zc.lockfile
2026-03-24T12:09:47.640 INFO:teuthology.orchestra.run.vm05.stdout: sg3-utils sg3-utils-udev smartmontools socat xmlstarlet
2026-03-24T12:09:47.640 INFO:teuthology.orchestra.run.vm05.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T12:09:47.658 INFO:teuthology.orchestra.run.vm05.stdout:0 upgraded, 0 newly installed, 0 to remove and 44 not upgraded.
2026-03-24T12:09:47.658 INFO:teuthology.orchestra.run.vm05.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T12:09:47.693 INFO:teuthology.orchestra.run.vm05.stdout:Reading package lists...
2026-03-24T12:09:47.865 INFO:teuthology.orchestra.run.vm05.stdout:Building dependency tree...
2026-03-24T12:09:47.865 INFO:teuthology.orchestra.run.vm05.stdout:Reading state information...
2026-03-24T12:09:47.977 INFO:teuthology.orchestra.run.vm05.stdout:Package 'radosgw' is not installed, so not removed
2026-03-24T12:09:47.977 INFO:teuthology.orchestra.run.vm05.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T12:09:47.977 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-24T12:09:47.977 INFO:teuthology.orchestra.run.vm05.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-24T12:09:47.977 INFO:teuthology.orchestra.run.vm05.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc
2026-03-24T12:09:47.977 INFO:teuthology.orchestra.run.vm05.stdout: python3-asyncssh python3-cachetools python3-ceph-common python3-cheroot
2026-03-24T12:09:47.977 INFO:teuthology.orchestra.run.vm05.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-24T12:09:47.977 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-24T12:09:47.978 INFO:teuthology.orchestra.run.vm05.stdout: python3-joblib python3-kubernetes python3-natsort python3-portend
2026-03-24T12:09:47.978 INFO:teuthology.orchestra.run.vm05.stdout: python3-prettytable python3-psutil python3-repoze.lru
2026-03-24T12:09:47.978 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplejson
2026-03-24T12:09:47.978 INFO:teuthology.orchestra.run.vm05.stdout: python3-sklearn python3-sklearn-lib python3-tempora python3-threadpoolctl
2026-03-24T12:09:47.978 INFO:teuthology.orchestra.run.vm05.stdout: python3-wcwidth python3-webob python3-websocket python3-zc.lockfile
2026-03-24T12:09:47.978 INFO:teuthology.orchestra.run.vm05.stdout: sg3-utils sg3-utils-udev smartmontools socat xmlstarlet
2026-03-24T12:09:47.978 INFO:teuthology.orchestra.run.vm05.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T12:09:47.994 INFO:teuthology.orchestra.run.vm05.stdout:0 upgraded, 0 newly installed, 0 to remove and 44 not upgraded.
2026-03-24T12:09:47.994 INFO:teuthology.orchestra.run.vm05.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T12:09:48.029 INFO:teuthology.orchestra.run.vm05.stdout:Reading package lists...
2026-03-24T12:09:48.187 INFO:teuthology.orchestra.run.vm05.stdout:Building dependency tree...
2026-03-24T12:09:48.187 INFO:teuthology.orchestra.run.vm05.stdout:Reading state information...
2026-03-24T12:09:48.309 INFO:teuthology.orchestra.run.vm05.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T12:09:48.309 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-24T12:09:48.309 INFO:teuthology.orchestra.run.vm05.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-24T12:09:48.309 INFO:teuthology.orchestra.run.vm05.stdout: librdkafka1 librgw2 libsgutils2-2 libsqlite3-mod-ceph nvme-cli
2026-03-24T12:09:48.309 INFO:teuthology.orchestra.run.vm05.stdout: python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-24T12:09:48.309 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-24T12:09:48.309 INFO:teuthology.orchestra.run.vm05.stdout: python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-24T12:09:48.309 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-24T12:09:48.309 INFO:teuthology.orchestra.run.vm05.stdout: python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-24T12:09:48.309 INFO:teuthology.orchestra.run.vm05.stdout: python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-24T12:09:48.309 INFO:teuthology.orchestra.run.vm05.stdout: python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-24T12:09:48.309 INFO:teuthology.orchestra.run.vm05.stdout: python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-24T12:09:48.309 INFO:teuthology.orchestra.run.vm05.stdout: python3-websocket python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools
2026-03-24T12:09:48.309 INFO:teuthology.orchestra.run.vm05.stdout: socat xmlstarlet
2026-03-24T12:09:48.310 INFO:teuthology.orchestra.run.vm05.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T12:09:48.319 INFO:teuthology.orchestra.run.vm05.stdout:The following packages will be REMOVED:
2026-03-24T12:09:48.319 INFO:teuthology.orchestra.run.vm05.stdout: python3-cephfs* python3-rados* python3-rgw*
2026-03-24T12:09:48.477 INFO:teuthology.orchestra.run.vm05.stdout:0 upgraded, 0 newly installed, 3 to remove and 44 not upgraded.
2026-03-24T12:09:48.478 INFO:teuthology.orchestra.run.vm05.stdout:After this operation, 2086 kB disk space will be freed.
2026-03-24T12:09:48.514 INFO:teuthology.orchestra.run.vm05.stdout:(Reading database ... 123755 files and directories currently installed.)
2026-03-24T12:09:48.516 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-cephfs (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:48.530 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-rgw (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:48.542 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-rados (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:49.554 INFO:teuthology.orchestra.run.vm05.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T12:09:49.592 INFO:teuthology.orchestra.run.vm05.stdout:Reading package lists...
2026-03-24T12:09:49.762 INFO:teuthology.orchestra.run.vm05.stdout:Building dependency tree...
2026-03-24T12:09:49.763 INFO:teuthology.orchestra.run.vm05.stdout:Reading state information...
2026-03-24T12:09:49.878 INFO:teuthology.orchestra.run.vm05.stdout:Package 'python3-rgw' is not installed, so not removed
2026-03-24T12:09:49.878 INFO:teuthology.orchestra.run.vm05.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T12:09:49.878 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-24T12:09:49.879 INFO:teuthology.orchestra.run.vm05.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-24T12:09:49.879 INFO:teuthology.orchestra.run.vm05.stdout: librdkafka1 librgw2 libsgutils2-2 libsqlite3-mod-ceph nvme-cli
2026-03-24T12:09:49.879 INFO:teuthology.orchestra.run.vm05.stdout: python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-24T12:09:49.879 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-24T12:09:49.879 INFO:teuthology.orchestra.run.vm05.stdout: python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-24T12:09:49.879 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-24T12:09:49.879 INFO:teuthology.orchestra.run.vm05.stdout: python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-24T12:09:49.879 INFO:teuthology.orchestra.run.vm05.stdout: python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-24T12:09:49.879 INFO:teuthology.orchestra.run.vm05.stdout: python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-24T12:09:49.879 INFO:teuthology.orchestra.run.vm05.stdout: python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-24T12:09:49.879 INFO:teuthology.orchestra.run.vm05.stdout: python3-websocket python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools
2026-03-24T12:09:49.879 INFO:teuthology.orchestra.run.vm05.stdout: socat xmlstarlet
2026-03-24T12:09:49.879 INFO:teuthology.orchestra.run.vm05.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T12:09:49.897 INFO:teuthology.orchestra.run.vm05.stdout:0 upgraded, 0 newly installed, 0 to remove and 44 not upgraded.
2026-03-24T12:09:49.897 INFO:teuthology.orchestra.run.vm05.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T12:09:49.932 INFO:teuthology.orchestra.run.vm05.stdout:Reading package lists...
2026-03-24T12:09:50.093 INFO:teuthology.orchestra.run.vm05.stdout:Building dependency tree...
2026-03-24T12:09:50.093 INFO:teuthology.orchestra.run.vm05.stdout:Reading state information...
2026-03-24T12:09:50.206 INFO:teuthology.orchestra.run.vm05.stdout:Package 'python3-cephfs' is not installed, so not removed
2026-03-24T12:09:50.206 INFO:teuthology.orchestra.run.vm05.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T12:09:50.206 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-24T12:09:50.206 INFO:teuthology.orchestra.run.vm05.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-24T12:09:50.206 INFO:teuthology.orchestra.run.vm05.stdout: librdkafka1 librgw2 libsgutils2-2 libsqlite3-mod-ceph nvme-cli
2026-03-24T12:09:50.206 INFO:teuthology.orchestra.run.vm05.stdout: python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-24T12:09:50.206 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-24T12:09:50.206 INFO:teuthology.orchestra.run.vm05.stdout: python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-24T12:09:50.206 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-24T12:09:50.206 INFO:teuthology.orchestra.run.vm05.stdout: python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-24T12:09:50.206 INFO:teuthology.orchestra.run.vm05.stdout: python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-24T12:09:50.206 INFO:teuthology.orchestra.run.vm05.stdout: python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-24T12:09:50.206 INFO:teuthology.orchestra.run.vm05.stdout: python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-24T12:09:50.206 INFO:teuthology.orchestra.run.vm05.stdout: python3-websocket python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools
2026-03-24T12:09:50.206 INFO:teuthology.orchestra.run.vm05.stdout: socat xmlstarlet
2026-03-24T12:09:50.206 INFO:teuthology.orchestra.run.vm05.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T12:09:50.222 INFO:teuthology.orchestra.run.vm05.stdout:0 upgraded, 0 newly installed, 0 to remove and 44 not upgraded.
2026-03-24T12:09:50.222 INFO:teuthology.orchestra.run.vm05.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T12:09:50.260 INFO:teuthology.orchestra.run.vm05.stdout:Reading package lists...
2026-03-24T12:09:50.420 INFO:teuthology.orchestra.run.vm05.stdout:Building dependency tree...
2026-03-24T12:09:50.421 INFO:teuthology.orchestra.run.vm05.stdout:Reading state information...
2026-03-24T12:09:50.531 INFO:teuthology.orchestra.run.vm05.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T12:09:50.532 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-24T12:09:50.532 INFO:teuthology.orchestra.run.vm05.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-24T12:09:50.532 INFO:teuthology.orchestra.run.vm05.stdout: librdkafka1 librgw2 libsgutils2-2 libsqlite3-mod-ceph nvme-cli
2026-03-24T12:09:50.532 INFO:teuthology.orchestra.run.vm05.stdout: python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-24T12:09:50.532 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-24T12:09:50.532 INFO:teuthology.orchestra.run.vm05.stdout: python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-24T12:09:50.532 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-24T12:09:50.532 INFO:teuthology.orchestra.run.vm05.stdout: python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-24T12:09:50.532 INFO:teuthology.orchestra.run.vm05.stdout: python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-24T12:09:50.532 INFO:teuthology.orchestra.run.vm05.stdout: python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-24T12:09:50.532 INFO:teuthology.orchestra.run.vm05.stdout: python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-24T12:09:50.532 INFO:teuthology.orchestra.run.vm05.stdout: python3-websocket python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools
2026-03-24T12:09:50.532 INFO:teuthology.orchestra.run.vm05.stdout: socat xmlstarlet
2026-03-24T12:09:50.532 INFO:teuthology.orchestra.run.vm05.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T12:09:50.541 INFO:teuthology.orchestra.run.vm05.stdout:The following packages will be REMOVED:
2026-03-24T12:09:50.541 INFO:teuthology.orchestra.run.vm05.stdout: python3-rbd*
2026-03-24T12:09:50.700 INFO:teuthology.orchestra.run.vm05.stdout:0 upgraded, 0 newly installed, 1 to remove and 44 not upgraded.
2026-03-24T12:09:50.700 INFO:teuthology.orchestra.run.vm05.stdout:After this operation, 1205 kB disk space will be freed.
2026-03-24T12:09:50.741 INFO:teuthology.orchestra.run.vm05.stdout:(Reading database ... 123731 files and directories currently installed.)
2026-03-24T12:09:50.743 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-rbd (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:51.820 INFO:teuthology.orchestra.run.vm05.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T12:09:51.855 INFO:teuthology.orchestra.run.vm05.stdout:Reading package lists...
2026-03-24T12:09:52.037 INFO:teuthology.orchestra.run.vm05.stdout:Building dependency tree...
2026-03-24T12:09:52.037 INFO:teuthology.orchestra.run.vm05.stdout:Reading state information...
2026-03-24T12:09:52.151 INFO:teuthology.orchestra.run.vm05.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T12:09:52.151 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-24T12:09:52.151 INFO:teuthology.orchestra.run.vm05.stdout: libboost-thread1.74.0 libcephfs-proxy2 libjq1 liboath0 libonig5 libpmemobj1
2026-03-24T12:09:52.152 INFO:teuthology.orchestra.run.vm05.stdout: libradosstriper1 librdkafka1 librgw2 libsgutils2-2 libsqlite3-mod-ceph
2026-03-24T12:09:52.152 INFO:teuthology.orchestra.run.vm05.stdout: nvme-cli python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-24T12:09:52.152 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-24T12:09:52.152 INFO:teuthology.orchestra.run.vm05.stdout: python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-24T12:09:52.152 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-24T12:09:52.152 INFO:teuthology.orchestra.run.vm05.stdout: python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-24T12:09:52.152 INFO:teuthology.orchestra.run.vm05.stdout: python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-24T12:09:52.152 INFO:teuthology.orchestra.run.vm05.stdout: python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-24T12:09:52.152 INFO:teuthology.orchestra.run.vm05.stdout: python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-24T12:09:52.152 INFO:teuthology.orchestra.run.vm05.stdout: python3-websocket python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools
2026-03-24T12:09:52.152 INFO:teuthology.orchestra.run.vm05.stdout: socat xmlstarlet
2026-03-24T12:09:52.152 INFO:teuthology.orchestra.run.vm05.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T12:09:52.160 INFO:teuthology.orchestra.run.vm05.stdout:The following packages will be REMOVED:
2026-03-24T12:09:52.160 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs-daemon* libcephfs-dev* libcephfs2*
2026-03-24T12:09:52.322 INFO:teuthology.orchestra.run.vm05.stdout:0 upgraded, 0 newly installed, 3 to remove and 44 not upgraded.
2026-03-24T12:09:52.322 INFO:teuthology.orchestra.run.vm05.stdout:After this operation, 2851 kB disk space will be freed.
2026-03-24T12:09:52.360 INFO:teuthology.orchestra.run.vm05.stdout:(Reading database ... 123723 files and directories currently installed.)
2026-03-24T12:09:52.362 INFO:teuthology.orchestra.run.vm05.stdout:Removing libcephfs-daemon (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:52.373 INFO:teuthology.orchestra.run.vm05.stdout:Removing libcephfs-dev (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:52.385 INFO:teuthology.orchestra.run.vm05.stdout:Removing libcephfs2 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:52.409 INFO:teuthology.orchestra.run.vm05.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ...
2026-03-24T12:09:53.432 INFO:teuthology.orchestra.run.vm05.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T12:09:53.468 INFO:teuthology.orchestra.run.vm05.stdout:Reading package lists...
2026-03-24T12:09:53.641 INFO:teuthology.orchestra.run.vm05.stdout:Building dependency tree...
2026-03-24T12:09:53.641 INFO:teuthology.orchestra.run.vm05.stdout:Reading state information...
2026-03-24T12:09:53.764 INFO:teuthology.orchestra.run.vm05.stdout:Package 'libcephfs-dev' is not installed, so not removed
2026-03-24T12:09:53.764 INFO:teuthology.orchestra.run.vm05.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T12:09:53.765 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-24T12:09:53.765 INFO:teuthology.orchestra.run.vm05.stdout: libboost-thread1.74.0 libcephfs-proxy2 libjq1 liboath0 libonig5 libpmemobj1
2026-03-24T12:09:53.765 INFO:teuthology.orchestra.run.vm05.stdout: libradosstriper1 librdkafka1 librgw2 libsgutils2-2 libsqlite3-mod-ceph
2026-03-24T12:09:53.765 INFO:teuthology.orchestra.run.vm05.stdout: nvme-cli python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-24T12:09:53.765 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-24T12:09:53.765 INFO:teuthology.orchestra.run.vm05.stdout: python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-24T12:09:53.765 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-24T12:09:53.765 INFO:teuthology.orchestra.run.vm05.stdout: python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-24T12:09:53.765 INFO:teuthology.orchestra.run.vm05.stdout: python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-24T12:09:53.765 INFO:teuthology.orchestra.run.vm05.stdout: python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-24T12:09:53.765 INFO:teuthology.orchestra.run.vm05.stdout: python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-24T12:09:53.765 INFO:teuthology.orchestra.run.vm05.stdout: python3-websocket python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools
2026-03-24T12:09:53.765 INFO:teuthology.orchestra.run.vm05.stdout: socat xmlstarlet
2026-03-24T12:09:53.765 INFO:teuthology.orchestra.run.vm05.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T12:09:53.782 INFO:teuthology.orchestra.run.vm05.stdout:0 upgraded, 0 newly installed, 0 to remove and 44 not upgraded.
2026-03-24T12:09:53.782 INFO:teuthology.orchestra.run.vm05.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T12:09:53.818 INFO:teuthology.orchestra.run.vm05.stdout:Reading package lists...
2026-03-24T12:09:53.983 INFO:teuthology.orchestra.run.vm05.stdout:Building dependency tree...
2026-03-24T12:09:53.984 INFO:teuthology.orchestra.run.vm05.stdout:Reading state information...
2026-03-24T12:09:54.096 INFO:teuthology.orchestra.run.vm05.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T12:09:54.096 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-24T12:09:54.097 INFO:teuthology.orchestra.run.vm05.stdout: libboost-thread1.74.0 libcephfs-proxy2 libdouble-conversion3 libfuse2
2026-03-24T12:09:54.097 INFO:teuthology.orchestra.run.vm05.stdout: libgfapi0 libgfrpc0 libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1
2026-03-24T12:09:54.097 INFO:teuthology.orchestra.run.vm05.stdout: libnbd0 liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5
2026-03-24T12:09:54.097 INFO:teuthology.orchestra.run.vm05.stdout: libqt5network5 librdkafka1 libsgutils2-2 libthrift-0.16.0 nvme-cli
2026-03-24T12:09:54.097 INFO:teuthology.orchestra.run.vm05.stdout: python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-24T12:09:54.097 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-24T12:09:54.097 INFO:teuthology.orchestra.run.vm05.stdout: python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-24T12:09:54.097 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-24T12:09:54.097 INFO:teuthology.orchestra.run.vm05.stdout: python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-24T12:09:54.097 INFO:teuthology.orchestra.run.vm05.stdout: python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-24T12:09:54.097 INFO:teuthology.orchestra.run.vm05.stdout: python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-24T12:09:54.097 INFO:teuthology.orchestra.run.vm05.stdout: python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-24T12:09:54.097 INFO:teuthology.orchestra.run.vm05.stdout: python3-websocket python3-zc.lockfile qttranslations5-l10n sg3-utils
2026-03-24T12:09:54.097 INFO:teuthology.orchestra.run.vm05.stdout: sg3-utils-udev smartmontools socat xmlstarlet
2026-03-24T12:09:54.097 INFO:teuthology.orchestra.run.vm05.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T12:09:54.106 INFO:teuthology.orchestra.run.vm05.stdout:The following packages will be REMOVED:
2026-03-24T12:09:54.107 INFO:teuthology.orchestra.run.vm05.stdout: librados2* libradosstriper1* librbd1* librgw2* libsqlite3-mod-ceph*
2026-03-24T12:09:54.107 INFO:teuthology.orchestra.run.vm05.stdout: qemu-block-extra* rbd-fuse*
2026-03-24T12:09:54.273 INFO:teuthology.orchestra.run.vm05.stdout:0 upgraded, 0 newly installed, 7 to remove and 44 not upgraded.
2026-03-24T12:09:54.273 INFO:teuthology.orchestra.run.vm05.stdout:After this operation, 59.2 MB disk space will be freed.
2026-03-24T12:09:54.314 INFO:teuthology.orchestra.run.vm05.stdout:(Reading database ... 123701 files and directories currently installed.)
2026-03-24T12:09:54.316 INFO:teuthology.orchestra.run.vm05.stdout:Removing rbd-fuse (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:54.329 INFO:teuthology.orchestra.run.vm05.stdout:Removing libsqlite3-mod-ceph (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:54.342 INFO:teuthology.orchestra.run.vm05.stdout:Removing libradosstriper1 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:54.355 INFO:teuthology.orchestra.run.vm05.stdout:Removing qemu-block-extra (1:6.2+dfsg-2ubuntu6.28) ...
2026-03-24T12:09:54.766 INFO:teuthology.orchestra.run.vm05.stdout:Removing librbd1 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:54.779 INFO:teuthology.orchestra.run.vm05.stdout:Removing librgw2 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:54.793 INFO:teuthology.orchestra.run.vm05.stdout:Removing librados2 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:54.820 INFO:teuthology.orchestra.run.vm05.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-24T12:09:54.855 INFO:teuthology.orchestra.run.vm05.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ...
2026-03-24T12:09:54.926 INFO:teuthology.orchestra.run.vm05.stdout:(Reading database ... 123650 files and directories currently installed.)
2026-03-24T12:09:54.928 INFO:teuthology.orchestra.run.vm05.stdout:Purging configuration files for qemu-block-extra (1:6.2+dfsg-2ubuntu6.28) ...
2026-03-24T12:09:56.319 INFO:teuthology.orchestra.run.vm05.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T12:09:56.357 INFO:teuthology.orchestra.run.vm05.stdout:Reading package lists...
2026-03-24T12:09:56.521 INFO:teuthology.orchestra.run.vm05.stdout:Building dependency tree...
2026-03-24T12:09:56.521 INFO:teuthology.orchestra.run.vm05.stdout:Reading state information...
2026-03-24T12:09:56.633 INFO:teuthology.orchestra.run.vm05.stdout:Package 'librbd1' is not installed, so not removed
2026-03-24T12:09:56.633 INFO:teuthology.orchestra.run.vm05.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T12:09:56.633 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-24T12:09:56.633 INFO:teuthology.orchestra.run.vm05.stdout: libboost-thread1.74.0 libcephfs-proxy2 libdouble-conversion3 libfuse2
2026-03-24T12:09:56.633 INFO:teuthology.orchestra.run.vm05.stdout: libgfapi0 libgfrpc0 libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1
2026-03-24T12:09:56.633 INFO:teuthology.orchestra.run.vm05.stdout: libnbd0 liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5
2026-03-24T12:09:56.633 INFO:teuthology.orchestra.run.vm05.stdout: libqt5network5 librdkafka1 libsgutils2-2 libthrift-0.16.0 nvme-cli
2026-03-24T12:09:56.633 INFO:teuthology.orchestra.run.vm05.stdout: python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-24T12:09:56.633 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-24T12:09:56.633 INFO:teuthology.orchestra.run.vm05.stdout: python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-24T12:09:56.633 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-24T12:09:56.633 INFO:teuthology.orchestra.run.vm05.stdout: python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-24T12:09:56.633 INFO:teuthology.orchestra.run.vm05.stdout: python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-24T12:09:56.633 INFO:teuthology.orchestra.run.vm05.stdout: python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-24T12:09:56.633 INFO:teuthology.orchestra.run.vm05.stdout: python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-24T12:09:56.633 INFO:teuthology.orchestra.run.vm05.stdout: python3-websocket python3-zc.lockfile qttranslations5-l10n sg3-utils
2026-03-24T12:09:56.633 INFO:teuthology.orchestra.run.vm05.stdout: sg3-utils-udev smartmontools socat xmlstarlet
2026-03-24T12:09:56.633 INFO:teuthology.orchestra.run.vm05.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T12:09:56.652 INFO:teuthology.orchestra.run.vm05.stdout:0 upgraded, 0 newly installed, 0 to remove and 44 not upgraded.
2026-03-24T12:09:56.652 INFO:teuthology.orchestra.run.vm05.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T12:09:56.687 INFO:teuthology.orchestra.run.vm05.stdout:Reading package lists...
2026-03-24T12:09:56.870 INFO:teuthology.orchestra.run.vm05.stdout:Building dependency tree...
2026-03-24T12:09:56.870 INFO:teuthology.orchestra.run.vm05.stdout:Reading state information...
2026-03-24T12:09:56.995 INFO:teuthology.orchestra.run.vm05.stdout:Package 'rbd-fuse' is not installed, so not removed
2026-03-24T12:09:56.995 INFO:teuthology.orchestra.run.vm05.stdout:The following packages were automatically installed and are no longer required:
2026-03-24T12:09:56.995 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-24T12:09:56.995 INFO:teuthology.orchestra.run.vm05.stdout: libboost-thread1.74.0 libcephfs-proxy2 libdouble-conversion3 libfuse2
2026-03-24T12:09:56.995 INFO:teuthology.orchestra.run.vm05.stdout: libgfapi0 libgfrpc0 libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1
2026-03-24T12:09:56.995 INFO:teuthology.orchestra.run.vm05.stdout: libnbd0 liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5
2026-03-24T12:09:56.995 INFO:teuthology.orchestra.run.vm05.stdout: libqt5network5 librdkafka1 libsgutils2-2 libthrift-0.16.0 nvme-cli
2026-03-24T12:09:56.995 INFO:teuthology.orchestra.run.vm05.stdout: python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-24T12:09:56.995 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-24T12:09:56.995 INFO:teuthology.orchestra.run.vm05.stdout: python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-24T12:09:56.995 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-24T12:09:56.995 INFO:teuthology.orchestra.run.vm05.stdout: python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-24T12:09:56.995 INFO:teuthology.orchestra.run.vm05.stdout: python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-24T12:09:56.995 INFO:teuthology.orchestra.run.vm05.stdout: python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-24T12:09:56.995 INFO:teuthology.orchestra.run.vm05.stdout: python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-24T12:09:56.995 INFO:teuthology.orchestra.run.vm05.stdout: python3-websocket python3-zc.lockfile qttranslations5-l10n sg3-utils
2026-03-24T12:09:56.995 INFO:teuthology.orchestra.run.vm05.stdout: sg3-utils-udev smartmontools socat xmlstarlet
2026-03-24T12:09:56.995 INFO:teuthology.orchestra.run.vm05.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-24T12:09:57.017 INFO:teuthology.orchestra.run.vm05.stdout:0 upgraded, 0 newly installed, 0 to remove and 44 not upgraded.
2026-03-24T12:09:57.017 INFO:teuthology.orchestra.run.vm05.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T12:09:57.019 DEBUG:teuthology.orchestra.run.vm05:> dpkg -l | grep '^.\(U\|H\)R' | awk '{print $2}' | sudo xargs --no-run-if-empty dpkg -P --force-remove-reinstreq
2026-03-24T12:09:57.075 DEBUG:teuthology.orchestra.run.vm05:> sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" autoremove
2026-03-24T12:09:57.154 INFO:teuthology.orchestra.run.vm05.stdout:Reading package lists...
2026-03-24T12:09:57.311 INFO:teuthology.orchestra.run.vm05.stdout:Building dependency tree...
2026-03-24T12:09:57.311 INFO:teuthology.orchestra.run.vm05.stdout:Reading state information...
2026-03-24T12:09:57.429 INFO:teuthology.orchestra.run.vm05.stdout:The following packages will be REMOVED:
2026-03-24T12:09:57.429 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-24T12:09:57.429 INFO:teuthology.orchestra.run.vm05.stdout: libboost-thread1.74.0 libcephfs-proxy2 libdouble-conversion3 libfuse2
2026-03-24T12:09:57.429 INFO:teuthology.orchestra.run.vm05.stdout: libgfapi0 libgfrpc0 libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1
2026-03-24T12:09:57.429 INFO:teuthology.orchestra.run.vm05.stdout: libnbd0 liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5
2026-03-24T12:09:57.429 INFO:teuthology.orchestra.run.vm05.stdout: libqt5network5 librdkafka1 libsgutils2-2 libthrift-0.16.0 nvme-cli
2026-03-24T12:09:57.429 INFO:teuthology.orchestra.run.vm05.stdout: python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-24T12:09:57.429 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-24T12:09:57.429 INFO:teuthology.orchestra.run.vm05.stdout: python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-24T12:09:57.429 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-24T12:09:57.429 INFO:teuthology.orchestra.run.vm05.stdout: python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-24T12:09:57.429 INFO:teuthology.orchestra.run.vm05.stdout: python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-24T12:09:57.429 INFO:teuthology.orchestra.run.vm05.stdout: python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-24T12:09:57.429 INFO:teuthology.orchestra.run.vm05.stdout: python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-24T12:09:57.429 INFO:teuthology.orchestra.run.vm05.stdout: python3-websocket python3-zc.lockfile qttranslations5-l10n sg3-utils
2026-03-24T12:09:57.429 INFO:teuthology.orchestra.run.vm05.stdout: sg3-utils-udev smartmontools socat xmlstarlet
2026-03-24T12:09:57.591 INFO:teuthology.orchestra.run.vm05.stdout:0 upgraded, 0 newly installed, 64 to remove and 44 not upgraded.
2026-03-24T12:09:57.591 INFO:teuthology.orchestra.run.vm05.stdout:After this operation, 96.8 MB disk space will be freed.
2026-03-24T12:09:57.631 INFO:teuthology.orchestra.run.vm05.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 123650 files and directories currently installed.)
2026-03-24T12:09:57.633 INFO:teuthology.orchestra.run.vm05.stdout:Removing ceph-mgr-modules-core (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:57.652 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/volumes/fs/operations/versions' not empty so not removed
2026-03-24T12:09:57.652 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/test_orchestrator' not empty so not removed
2026-03-24T12:09:57.652 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/telemetry' not empty so not removed
2026-03-24T12:09:57.652 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/telegraf' not empty so not removed
2026-03-24T12:09:57.652 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/status' not empty so not removed
2026-03-24T12:09:57.652 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/stats/fs' not empty so not removed
2026-03-24T12:09:57.652 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/snap_schedule/fs' not empty so not removed
2026-03-24T12:09:57.652 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/selftest' not empty so not removed
2026-03-24T12:09:57.652 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/rgw' not empty so not removed
2026-03-24T12:09:57.652 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/rbd_support' not empty so not removed
2026-03-24T12:09:57.652 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/prometheus' not empty so not removed
2026-03-24T12:09:57.652 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/progress' not empty so not removed
2026-03-24T12:09:57.652 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/pg_autoscaler' not empty so not removed
2026-03-24T12:09:57.652 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/osd_support' not empty so not removed
2026-03-24T12:09:57.652 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/osd_perf_query' not empty so not removed
2026-03-24T12:09:57.652 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/orchestrator' not empty so not removed
2026-03-24T12:09:57.652 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/nfs' not empty so not removed
2026-03-24T12:09:57.652 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/mirroring/fs/dir_map' not empty so not removed
2026-03-24T12:09:57.652 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/localpool' not empty so not removed
2026-03-24T12:09:57.652 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/iostat' not empty so not removed
2026-03-24T12:09:57.652 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/insights' not empty so not removed
2026-03-24T12:09:57.652 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/influx' not empty so not removed
2026-03-24T12:09:57.652 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/devicehealth' not empty so not removed
2026-03-24T12:09:57.652 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/crash' not empty so not removed
2026-03-24T12:09:57.652 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/balancer' not empty so not removed
2026-03-24T12:09:57.652 INFO:teuthology.orchestra.run.vm05.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/alerts' not empty so not removed
2026-03-24T12:09:57.657 INFO:teuthology.orchestra.run.vm05.stdout:Removing jq (1.6-2.1ubuntu3.1) ...
2026-03-24T12:09:57.668 INFO:teuthology.orchestra.run.vm05.stdout:Removing kpartx (0.8.8-1ubuntu1.22.04.4) ...
2026-03-24T12:09:57.679 INFO:teuthology.orchestra.run.vm05.stdout:Removing libboost-iostreams1.74.0:amd64 (1.74.0-14ubuntu3) ...
2026-03-24T12:09:57.691 INFO:teuthology.orchestra.run.vm05.stdout:Removing libboost-thread1.74.0:amd64 (1.74.0-14ubuntu3) ...
2026-03-24T12:09:57.741 INFO:teuthology.orchestra.run.vm05.stdout:Removing libcephfs-proxy2 (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:57.752 INFO:teuthology.orchestra.run.vm05.stdout:Removing libthrift-0.16.0:amd64 (0.16.0-2) ...
2026-03-24T12:09:57.762 INFO:teuthology.orchestra.run.vm05.stdout:Removing libqt5network5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-24T12:09:57.774 INFO:teuthology.orchestra.run.vm05.stdout:Removing libqt5dbus5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-24T12:09:57.785 INFO:teuthology.orchestra.run.vm05.stdout:Removing libqt5core5a:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-24T12:09:57.805 INFO:teuthology.orchestra.run.vm05.stdout:Removing libdouble-conversion3:amd64 (3.1.7-4) ...
2026-03-24T12:09:57.818 INFO:teuthology.orchestra.run.vm05.stdout:Removing libfuse2:amd64 (2.9.9-5ubuntu3) ...
2026-03-24T12:09:57.830 INFO:teuthology.orchestra.run.vm05.stdout:Removing libgfapi0:amd64 (10.1-1ubuntu0.2) ...
2026-03-24T12:09:57.841 INFO:teuthology.orchestra.run.vm05.stdout:Removing libgfrpc0:amd64 (10.1-1ubuntu0.2) ...
2026-03-24T12:09:57.854 INFO:teuthology.orchestra.run.vm05.stdout:Removing libgfxdr0:amd64 (10.1-1ubuntu0.2) ...
2026-03-24T12:09:57.866 INFO:teuthology.orchestra.run.vm05.stdout:Removing libglusterfs0:amd64 (10.1-1ubuntu0.2) ...
2026-03-24T12:09:57.877 INFO:teuthology.orchestra.run.vm05.stdout:Removing libiscsi7:amd64 (1.19.0-3build2) ...
2026-03-24T12:09:57.889 INFO:teuthology.orchestra.run.vm05.stdout:Removing libjq1:amd64 (1.6-2.1ubuntu3.1) ...
2026-03-24T12:09:57.901 INFO:teuthology.orchestra.run.vm05.stdout:Removing liblttng-ust1:amd64 (2.13.1-1ubuntu1) ...
2026-03-24T12:09:57.913 INFO:teuthology.orchestra.run.vm05.stdout:Removing libnbd0 (1.10.5-1) ...
2026-03-24T12:09:57.924 INFO:teuthology.orchestra.run.vm05.stdout:Removing liboath0:amd64 (2.6.7-3ubuntu0.1) ...
2026-03-24T12:09:57.934 INFO:teuthology.orchestra.run.vm05.stdout:Removing libonig5:amd64 (6.9.7.1-2build1) ...
2026-03-24T12:09:57.945 INFO:teuthology.orchestra.run.vm05.stdout:Removing libpcre2-16-0:amd64 (10.39-3ubuntu0.1) ...
2026-03-24T12:09:57.956 INFO:teuthology.orchestra.run.vm05.stdout:Removing libpmemobj1:amd64 (1.11.1-3build1) ...
2026-03-24T12:09:57.968 INFO:teuthology.orchestra.run.vm05.stdout:Removing librdkafka1:amd64 (1.8.0-1build1) ...
2026-03-24T12:09:57.979 INFO:teuthology.orchestra.run.vm05.stdout:Removing sg3-utils-udev (1.46-1ubuntu0.22.04.1) ...
2026-03-24T12:09:57.987 INFO:teuthology.orchestra.run.vm05.stdout:update-initramfs: deferring update (trigger activated)
2026-03-24T12:09:57.997 INFO:teuthology.orchestra.run.vm05.stdout:Removing sg3-utils (1.46-1ubuntu0.22.04.1) ...
2026-03-24T12:09:58.013 INFO:teuthology.orchestra.run.vm05.stdout:Removing libsgutils2-2:amd64 (1.46-1ubuntu0.22.04.1) ...
2026-03-24T12:09:58.024 INFO:teuthology.orchestra.run.vm05.stdout:Removing nvme-cli (1.16-3ubuntu0.3) ...
2026-03-24T12:09:58.423 INFO:teuthology.orchestra.run.vm05.stdout:Removing python-asyncssh-doc (2.5.0-1ubuntu0.1) ...
2026-03-24T12:09:58.438 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-asyncssh (2.5.0-1ubuntu0.1) ...
2026-03-24T12:09:58.498 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-kubernetes (12.0.1-1ubuntu1) ...
2026-03-24T12:09:58.762 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-google-auth (1.5.1-3) ...
2026-03-24T12:09:58.815 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-cachetools (5.0.0-1) ...
2026-03-24T12:09:58.867 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-ceph-argparse (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:58.921 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-ceph-common (20.2.0-712-g70f8415b-1jammy) ...
2026-03-24T12:09:58.982 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-cherrypy3 (18.6.1-4) ...
2026-03-24T12:09:59.052 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-cheroot (8.5.2+ds1-1ubuntu3.1) ...
2026-03-24T12:09:59.106 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-jaraco.collections (3.4.0-2) ...
2026-03-24T12:09:59.157 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-jaraco.classes (3.2.1-3) ...
2026-03-24T12:09:59.214 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-portend (3.0.0-1) ...
2026-03-24T12:09:59.264 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-tempora (4.1.2-1) ...
2026-03-24T12:09:59.314 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-jaraco.text (3.6.0-2) ...
2026-03-24T12:09:59.365 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-jaraco.functools (3.4.0-2) ...
2026-03-24T12:09:59.415 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-sklearn (0.23.2-5ubuntu6) ...
2026-03-24T12:09:59.543 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-joblib (0.17.0-4ubuntu1) ...
2026-03-24T12:09:59.603 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-natsort (8.0.2-1) ...
2026-03-24T12:09:59.657 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-prettytable (2.5.0-2) ...
2026-03-24T12:09:59.707 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-psutil (5.9.0-1build1) ...
2026-03-24T12:09:59.827 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-routes (2.5.1-1ubuntu1) ...
2026-03-24T12:09:59.879 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-repoze.lru (0.7-2) ...
2026-03-24T12:09:59.928 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-requests-oauthlib (1.3.0+ds-0.1) ...
2026-03-24T12:09:59.979 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-rsa (4.8-1) ...
2026-03-24T12:10:00.029 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-simplejson (3.17.6-1build1) ...
2026-03-24T12:10:00.084 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-sklearn-lib:amd64 (0.23.2-5ubuntu6) ...
2026-03-24T12:10:00.098 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-threadpoolctl (3.1.0-1) ...
2026-03-24T12:10:00.153 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-wcwidth (0.2.5+dfsg1-1) ...
2026-03-24T12:10:00.204 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-webob (1:1.8.6-1.1ubuntu0.1) ...
2026-03-24T12:10:00.259 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-websocket (1.2.3-1) ...
2026-03-24T12:10:00.316 INFO:teuthology.orchestra.run.vm05.stdout:Removing python3-zc.lockfile (2.0-1) ...
2026-03-24T12:10:00.367 INFO:teuthology.orchestra.run.vm05.stdout:Removing qttranslations5-l10n (5.15.3-1) ...
2026-03-24T12:10:00.387 INFO:teuthology.orchestra.run.vm05.stdout:Removing smartmontools (7.2-1ubuntu0.1) ...
2026-03-24T12:10:00.823 INFO:teuthology.orchestra.run.vm05.stdout:Removing socat (1.7.4.1-3ubuntu4) ...
2026-03-24T12:10:00.835 INFO:teuthology.orchestra.run.vm05.stdout:Removing xmlstarlet (1.6.1-2.1) ...
2026-03-24T12:10:00.875 INFO:teuthology.orchestra.run.vm05.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ...
2026-03-24T12:10:00.888 INFO:teuthology.orchestra.run.vm05.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-24T12:10:00.944 INFO:teuthology.orchestra.run.vm05.stdout:Processing triggers for initramfs-tools (0.140ubuntu13.5) ...
2026-03-24T12:10:00.962 INFO:teuthology.orchestra.run.vm05.stdout:update-initramfs: Generating /boot/initrd.img-5.15.0-171-generic
2026-03-24T12:10:05.845 INFO:teuthology.orchestra.run.vm05.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-24T12:10:05.849 DEBUG:teuthology.parallel:result is None
2026-03-24T12:10:05.849 INFO:teuthology.task.install:Removing ceph sources lists on ubuntu@vm05.local
2026-03-24T12:10:05.850 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -f /etc/apt/sources.list.d/ceph.list
2026-03-24T12:10:05.898 DEBUG:teuthology.orchestra.run.vm05:> sudo apt-get update
2026-03-24T12:10:06.057 INFO:teuthology.orchestra.run.vm05.stdout:Hit:1 http://security.ubuntu.com/ubuntu jammy-security InRelease
2026-03-24T12:10:06.057 INFO:teuthology.orchestra.run.vm05.stdout:Hit:2 http://archive.ubuntu.com/ubuntu jammy InRelease
2026-03-24T12:10:06.092 INFO:teuthology.orchestra.run.vm05.stdout:Hit:3 http://archive.ubuntu.com/ubuntu jammy-updates InRelease
2026-03-24T12:10:06.127 INFO:teuthology.orchestra.run.vm05.stdout:Hit:4 http://archive.ubuntu.com/ubuntu jammy-backports InRelease
2026-03-24T12:10:07.155 INFO:teuthology.orchestra.run.vm05.stdout:Reading package lists...
2026-03-24T12:10:07.171 DEBUG:teuthology.parallel:result is None
2026-03-24T12:10:07.171 DEBUG:teuthology.run_tasks:Unwinding manager clock
2026-03-24T12:10:07.173 INFO:teuthology.task.clock:Checking final clock skew...
2026-03-24T12:10:07.173 DEBUG:teuthology.orchestra.run.vm05:> PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-24T12:10:07.564 INFO:teuthology.orchestra.run.vm05.stdout: remote refid st t when poll reach delay offset jitter
2026-03-24T12:10:07.565 INFO:teuthology.orchestra.run.vm05.stdout:==============================================================================
2026-03-24T12:10:07.565 INFO:teuthology.orchestra.run.vm05.stdout: 0.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-24T12:10:07.565 INFO:teuthology.orchestra.run.vm05.stdout: 1.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-24T12:10:07.565 INFO:teuthology.orchestra.run.vm05.stdout: 2.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-24T12:10:07.565 INFO:teuthology.orchestra.run.vm05.stdout: 3.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-24T12:10:07.565 INFO:teuthology.orchestra.run.vm05.stdout: ntp.ubuntu.com .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-24T12:10:07.565 INFO:teuthology.orchestra.run.vm05.stdout:+178.215.228.24 189.97.54.122 2 u 58 256 377 21.807 -9.155 2.431
2026-03-24T12:10:07.565 INFO:teuthology.orchestra.run.vm05.stdout:-172-236-195-26. 233.72.92.146 3 u 65 256 377 23.408 -6.907 2.328
2026-03-24T12:10:07.565 INFO:teuthology.orchestra.run.vm05.stdout:+obelix.hetzner. 91.98.156.7 3 u 244 256 377 25.050 -6.757 1.321
2026-03-24T12:10:07.565 INFO:teuthology.orchestra.run.vm05.stdout:-mail.gunnarhofm 192.53.103.108 2 u 64 256 377 25.046 -6.309 1.358
2026-03-24T12:10:07.565 INFO:teuthology.orchestra.run.vm05.stdout:*ntp1.agw1.soe.a .GPS. 1 u 227 256 377 25.011 -8.797 2.942
2026-03-24T12:10:07.565 INFO:teuthology.orchestra.run.vm05.stdout:-ntp1.rrze.uni-e .DCFp. 1 u 40 256 377 26.109 -6.084 2.060
2026-03-24T12:10:07.565 INFO:teuthology.orchestra.run.vm05.stdout:-vps-ber1.orlean 127.65.222.189 2 u 74 256 377 28.860 -5.322 1.628
2026-03-24T12:10:07.565 INFO:teuthology.orchestra.run.vm05.stdout:-141.84.43.75 176.179.21.42 2 u 55 256 377 31.890 -9.684 3.692
2026-03-24T12:10:07.565 INFO:teuthology.orchestra.run.vm05.stdout:-185.125.190.56 194.121.207.249 2 u 210 256 377 30.340 -5.142 2.254
2026-03-24T12:10:07.565 DEBUG:teuthology.run_tasks:Unwinding manager ansible.cephlab
2026-03-24T12:10:07.567 INFO:teuthology.task.ansible:Skipping ansible cleanup...
2026-03-24T12:10:07.568 DEBUG:teuthology.run_tasks:Unwinding manager selinux
2026-03-24T12:10:07.569 DEBUG:teuthology.run_tasks:Unwinding manager pcp
2026-03-24T12:10:07.572 DEBUG:teuthology.run_tasks:Unwinding manager internal.timer
2026-03-24T12:10:07.574 INFO:teuthology.task.internal:Duration was 5099.965860 seconds
2026-03-24T12:10:07.574 DEBUG:teuthology.run_tasks:Unwinding manager internal.syslog
2026-03-24T12:10:07.576 INFO:teuthology.task.internal.syslog:Shutting down syslog monitoring...
2026-03-24T12:10:07.576 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -f -- /etc/rsyslog.d/80-cephtest.conf && sudo service rsyslog restart
2026-03-24T12:10:07.599 INFO:teuthology.task.internal.syslog:Checking logs for errors...
2026-03-24T12:10:07.600 DEBUG:teuthology.task.internal.syslog:Checking ubuntu@vm05.local
2026-03-24T12:10:07.600 DEBUG:teuthology.orchestra.run.vm05:> grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' /home/ubuntu/cephtest/archive/syslog/kern.log | grep -v 'task .* blocked for more than .* seconds' | grep -v 'lockdep is turned off' | grep -v 'trying to register non-static key' | grep -v 'DEBUG: fsize' | grep -v CRON | grep -v 'BUG: bad unlock balance detected' | grep -v 'inconsistent lock state' | grep -v '*** DEADLOCK ***' | grep -v 'INFO: possible irq lock inversion dependency detected' | grep -v 'INFO: NMI handler (perf_event_nmi_handler) took too long to run' | grep -v 'INFO: recovery required on readonly' | grep -v 'ceph-create-keys: INFO' | grep -v INFO:ceph-create-keys | grep -v 'Loaded datasource DataSourceOpenStack' | grep -v 'container-storage-setup: INFO: Volume group backing root filesystem could not be determined' | grep -E -v '\bsalt-master\b|\bsalt-minion\b|\bsalt-api\b' | grep -v ceph-crash | grep -E -v '\btcmu-runner\b.*\bINFO\b' | head -n 1
2026-03-24T12:10:07.655 INFO:teuthology.task.internal.syslog:Gathering journactl...
2026-03-24T12:10:07.655 DEBUG:teuthology.orchestra.run.vm05:> sudo journalctl > /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-24T12:10:07.720 INFO:teuthology.task.internal.syslog:Compressing syslogs...
2026-03-24T12:10:07.720 DEBUG:teuthology.orchestra.run.vm05:> find /home/ubuntu/cephtest/archive/syslog -name '*.log' -print0 | sudo xargs -0 --max-args=1 --max-procs=0 --verbose --no-run-if-empty -- gzip -5 --verbose --
2026-03-24T12:10:07.770 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-24T12:10:07.770 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-24T12:10:07.770 INFO:teuthology.orchestra.run.vm05.stderr:/home/ubuntu/cephtest/archive/syslog/kern.log: 0.0%gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-24T12:10:07.770 INFO:teuthology.orchestra.run.vm05.stderr: -- replaced with /home/ubuntu/cephtest/archive/syslog/kern.log.gz
2026-03-24T12:10:07.771 INFO:teuthology.orchestra.run.vm05.stderr:/home/ubuntu/cephtest/archive/syslog/misc.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/misc.log.gz
2026-03-24T12:10:07.774 INFO:teuthology.orchestra.run.vm05.stderr:/home/ubuntu/cephtest/archive/syslog/journalctl.log: 85.9% -- replaced with /home/ubuntu/cephtest/archive/syslog/journalctl.log.gz
2026-03-24T12:10:07.775 DEBUG:teuthology.run_tasks:Unwinding manager internal.sudo
2026-03-24T12:10:07.778 INFO:teuthology.task.internal:Restoring /etc/sudoers...
2026-03-24T12:10:07.779 DEBUG:teuthology.orchestra.run.vm05:> sudo mv -f /etc/sudoers.orig.teuthology /etc/sudoers
2026-03-24T12:10:07.826 DEBUG:teuthology.run_tasks:Unwinding manager internal.coredump
2026-03-24T12:10:07.829 DEBUG:teuthology.orchestra.run.vm05:> sudo sysctl -w kernel.core_pattern=core && sudo bash -c 'for f in `find /home/ubuntu/cephtest/archive/coredump -type f`; do file $f | grep -q systemd-sysusers && rm $f || true ; done' && rmdir --ignore-fail-on-non-empty -- /home/ubuntu/cephtest/archive/coredump
2026-03-24T12:10:07.875 INFO:teuthology.orchestra.run.vm05.stdout:kernel.core_pattern = core
2026-03-24T12:10:07.883 DEBUG:teuthology.orchestra.run.vm05:> test -e /home/ubuntu/cephtest/archive/coredump
2026-03-24T12:10:07.929 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-24T12:10:07.929 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive
2026-03-24T12:10:07.933 INFO:teuthology.task.internal:Transferring archived files...
2026-03-24T12:10:07.933 DEBUG:teuthology.misc:Transferring archived files from vm05:/home/ubuntu/cephtest/archive to /archive/kyr-2026-03-20_22:04:26-rbd-tentacle-none-default-vps/3589/remote/vm05
2026-03-24T12:10:07.933 DEBUG:teuthology.orchestra.run.vm05:> sudo tar c -f - -C /home/ubuntu/cephtest/archive -- .
2026-03-24T12:10:07.979 INFO:teuthology.task.internal:Removing archive directory...
2026-03-24T12:10:07.979 DEBUG:teuthology.orchestra.run.vm05:> rm -rf -- /home/ubuntu/cephtest/archive
2026-03-24T12:10:08.026 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive_upload
2026-03-24T12:10:08.124 INFO:teuthology.task.internal:Not uploading archives.
2026-03-24T12:10:08.124 DEBUG:teuthology.run_tasks:Unwinding manager internal.base
2026-03-24T12:10:08.130 INFO:teuthology.task.internal:Tidying up after the test...
2026-03-24T12:10:08.130 DEBUG:teuthology.orchestra.run.vm05:> find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest
2026-03-24T12:10:08.134 INFO:teuthology.orchestra.run.vm05.stdout: 258207 4 drwxr-xr-x 2 ubuntu ubuntu 4096 Mar 24 12:10 /home/ubuntu/cephtest
2026-03-24T12:10:08.148 DEBUG:teuthology.run_tasks:Unwinding manager console_log
2026-03-24T12:10:08.163 INFO:teuthology.run:Summary data:
description: rbd/cli_v1/{base/install clusters/{fixed-1} conf/{disable-pool-app} features/format-1 msgr-failures/few objectstore/bluestore-comp-lz4 supported-random-distro$/{ubuntu_latest} workloads/rbd_cli_generic}
duration: 5099.965859651566
flavor: default
owner: kyr
success: true

2026-03-24T12:10:08.164 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-24T12:10:08.239 INFO:teuthology.run:pass